Oct 14, 2025 12:47:07 AM org.apache.karaf.main.Main launch
INFO: Installing and starting initial bundles
Oct 14, 2025 12:47:07 AM org.apache.karaf.main.Main launch
INFO: All initial bundles installed and set to start
Oct 14, 2025 12:47:07 AM org.apache.karaf.main.lock.SimpleFileLock lock
INFO: Trying to lock /tmp/karaf-0.23.0/lock
Oct 14, 2025 12:47:07 AM org.apache.karaf.main.lock.SimpleFileLock lock
INFO: Lock acquired
Oct 14, 2025 12:47:07 AM org.apache.karaf.main.Main$KarafLockCallback lockAcquired
INFO: Lock acquired. Setting startlevel to 100
2025-10-14T00:47:10,299 | INFO | CM Configuration Updater (Update: pid=org.ops4j.pax.logging) | EventAdminConfigurationNotifier | 5 - org.ops4j.pax.logging.pax-logging-log4j2 - 2.3.0 | Logging configuration changed. (Event Admin service unavailable - no notification sent).
2025-10-14T00:47:11,522 | INFO | activator-1-thread-2 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Adding features: odl-jolokia/[11.0.2,11.0.2],odl-openflowplugin-flow-services-rest/[0.20.1,0.20.1],odl-openflowplugin-app-bulk-o-matic/[0.20.1,0.20.1],acc6325d-fd2f-403b-b01e-19f1462d6d47/[0,0.0.0],odl-infrautils-ready/[7.1.7,7.1.7]
2025-10-14T00:47:11,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Changes to perform:
2025-10-14T00:47:11,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Region: root
2025-10-14T00:47:11,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Bundles to install:
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.el/jakarta.el-api/3.0.3
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.enterprise/cdi-api/2.0.SP1
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.interceptor/javax.interceptor-api/1.2.2
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.transaction/javax.transaction-api/1.2
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jasypt/1.9.3_1
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.javax-inject/1_3
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc/1.5.7
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc-config/1.5.7
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc-pool-common/1.5.7
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-wrap/2.6.17/jar/uber
2025-10-14T00:47:11,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.jdbc/1.1.0
2025-10-14T00:47:11,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Installing bundles:
2025-10-14T00:47:11,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.el/jakarta.el-api/3.0.3
2025-10-14T00:47:11,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.enterprise/cdi-api/2.0.SP1
2025-10-14T00:47:11,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.interceptor/javax.interceptor-api/1.2.2
2025-10-14T00:47:11,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.transaction/javax.transaction-api/1.2
2025-10-14T00:47:11,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jasypt/1.9.3_1
2025-10-14T00:47:11,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.javax-inject/1_3
2025-10-14T00:47:11,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc/1.5.7
2025-10-14T00:47:11,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc-config/1.5.7
2025-10-14T00:47:11,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc-pool-common/1.5.7
2025-10-14T00:47:11,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-wrap/2.6.17/jar/uber
2025-10-14T00:47:11,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.jdbc/1.1.0
2025-10-14T00:47:11,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Starting bundles:
2025-10-14T00:47:11,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.url.wrap/2.6.17
2025-10-14T00:47:11,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.jasypt/1.9.3.1
2025-10-14T00:47:11,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3
2025-10-14T00:47:11,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3
2025-10-14T00:47:11,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.interceptor-api/1.2.2
2025-10-14T00:47:11,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.enterprise.cdi-api/2.0.0.SP1
2025-10-14T00:47:11,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.transaction-api/1.2.0
2025-10-14T00:47:11,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.osgi.service.jdbc/1.1.0.202212101352
2025-10-14T00:47:11,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.pool.common/1.5.7
2025-10-14T00:47:11,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.config/1.5.7
2025-10-14T00:47:11,757 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc/1.5.7
2025-10-14T00:47:11,765 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Done.
2025-10-14T00:47:13,681 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Changes to perform:
2025-10-14T00:47:13,681 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Region: root
2025-10-14T00:47:13,681 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Bundles to uninstall:
2025-10-14T00:47:13,681 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3
2025-10-14T00:47:13,681 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Bundles to install:
2025-10-14T00:47:13,681 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.checkerframework/checker-qual/3.50.0
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.code.gson/gson/2.13.1
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/guava/33.4.8-jre
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/failureaccess/1.0.3
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.googlecode.json-simple/json-simple/1.1.1
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.h2database/h2/2.3.232
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.rabbitmq/amqp-client/5.26.0
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/config/1.4.3
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/ssl-config-core_2.13/0.6.1
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-client/1.38.1
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-driver/1.38.1
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-core/4.2.36
2025-10-14T00:47:13,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-graphite/4.2.36
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-healthchecks/4.2.36
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jmx/4.2.36
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jvm/4.2.36
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-buffer/4.2.6.Final
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-base/4.2.6.Final
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-compression/4.2.6.Final
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http/4.2.6.Final
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http2/4.2.6.Final
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-common/4.2.6.Final
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-handler/4.2.6.Final
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-resolver/4.2.6.Final
2025-10-14T00:47:13,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport/4.2.6.Final
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-classes-epoll/4.2.6.Final
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-epoll/4.2.6.Final/jar/linux-x86_64
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-unix-common/4.2.6.Final
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.activation/jakarta.activation-api/1.2.2
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.annotation/jakarta.annotation-api/1.3.5
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.servlet/jakarta.servlet-api/4.0.4
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.validation/jakarta.validation-api/2.0.2
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.ws.rs/jakarta.ws.rs-api/2.1.6
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.javassist/javassist/3.30.2-GA
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.servlet/javax.servlet-api/3.1.0
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.websocket/jakarta.websocket-api/1.1.2
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/karaf.branding/14.1.3
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.lz4/lz4-java/1.8.0
2025-10-14T00:47:13,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:net.bytebuddy/byte-buddy/1.17.7
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.agrona/agrona/1.15.2
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.antlr/antlr4-runtime/4.13.2
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.api/1.0.1
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.cm/1.3.2
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.core/1.10.3
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.api/1.1.5
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.api/1.2.0
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.core/1.2.0
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.core/1.1.8
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.whiteboard/1.2.0
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.proxy/org.apache.aries.proxy/1.1.14
2025-10-14T00:47:13,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.quiesce/org.apache.aries.quiesce.api/1.0.0
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries/org.apache.aries.util/1.1.3
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-collections/commons-collections/3.2.2
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-beanutils/commons-beanutils/1.11.0
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-codec/commons-codec/1.19.0
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-lang3/3.18.0
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-text/1.14.0
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.felix/org.apache.felix.scr/2.2.6
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.2
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.blueprintstate/4.4.8
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.core/4.4.8
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.config/org.apache.karaf.config.command/4.4.8
2025-10-14T00:47:13,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.blueprint/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.features/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.kar/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.wrap/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.boot/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.core/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.features/org.apache.karaf.features.command/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.http/org.apache.karaf.http.core/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.instance/org.apache.karaf.instance.core/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.command/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.config/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.modules/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jdbc/org.apache.karaf.jdbc.core/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.kar/org.apache.karaf.kar.core/4.4.8
2025-10-14T00:47:13,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.log/org.apache.karaf.log.core/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.management/org.apache.karaf.management.server/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.package/org.apache.karaf.package.core/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.management/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.state/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.service/org.apache.karaf.service.core/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.commands/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.console/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.core/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.ssh/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.table/4.4.8
2025-10-14T00:47:13,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.system/org.apache.karaf.system.core/4.4.8
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.web/org.apache.karaf.web.core/4.4.8
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-osgi/2.15.0
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-scp/2.15.0
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-sftp/2.15.0
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jdt/ecj/3.26.0
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-client/9.4.57.v20241219
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-continuation/9.4.57.v20241219
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-http/9.4.57.v20241219
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-io/9.4.57.v20241219
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jaas/9.4.57.v20241219
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jmx/9.4.57.v20241219
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-security/9.4.57.v20241219
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-server/9.4.57.v20241219
2025-10-14T00:47:13,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlet/9.4.57.v20241219
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlets/9.4.57.v20241219
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util/9.4.57.v20241219
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util-ajax/9.4.57.v20241219
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-xml/9.4.57.v20241219
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-api/2.6.1
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2.external/aopalliance-repackaged/2.6.1
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-locator/2.6.1
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/osgi-resource-locator/1.0.3
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-utils/2.6.1
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet/2.47
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet-core/2.47
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-client/2.47
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-common/2.47
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-server/2.47
2025-10-14T00:47:13,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.inject/jersey-hk2/2.47
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.media/jersey-media-sse/2.47
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jline/jline/3.21.0
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jolokia/jolokia-osgi/1.7.2
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jspecify/jspecify/1.0.0
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm/9.8
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-commons/9.8
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-tree/9.8
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-analysis/9.8
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-util/9.8
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-authn-api/0.21.2
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-cert/0.21.2
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-encrypt-service/0.21.2
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-encrypt-service-impl/0.21.2
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-filterchain/0.21.2
2025-10-14T00:47:13,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-idm-store-h2/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-jetty-auth-log-filter/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-api/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-impl/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/repackaged-shiro/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro-api/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-tokenauthrealm/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-api/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-osgi-impl/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-api/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-jersey2/0.21.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/atomix-storage/11.0.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/blueprint/11.0.2
2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-api/11.0.2 2025-10-14T00:47:13,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-client/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-dom-api/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-mgmt-api/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/eos-dom-akka/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-api/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-journal/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-spi/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/repackaged-pekko/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-akka-raft/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-akka-segmented-journal/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-api/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-impl/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-clustering-commons/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-common-util/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-distributed-datastore/11.0.2 2025-10-14T00:47:13,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-remoterpc-connector/11.0.2 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-api/7.1.7 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-impl/7.1.7 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-shell/7.1.7 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-api/7.1.7 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-impl/7.1.7 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl 
| 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/infrautils-util/7.1.7 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-dom-adapter/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-util/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-crypt-hash/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-encryption-algs/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-key-exchange-algs/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-mac-algs/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-public-key-algs/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-tls-cipher-suite-algs/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6241/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6243/14.0.18 2025-10-14T00:47:13,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6470/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-inet-types/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-yang-types/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7407-ietf-x509-cert-to-name/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7952/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf-monitoring/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8072/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8341/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-datastores/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-origin/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8343/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8344/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8525/14.0.18 2025-10-14T00:47:13,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8526/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8528/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8529/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8639/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8650/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9640/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9641/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9642/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-client/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-common/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-server/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-client/14.0.18 2025-10-14T00:47:13,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-common/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-server/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-client/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-common/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 
18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-server/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-adapter/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-api/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-spi/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-common-api/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-api/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-broker/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-schema-osgi/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-spi/14.0.18 2025-10-14T00:47:13,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-api/14.0.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-common-api/14.0.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-dom-api/14.0.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-api/14.0.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-impl/14.0.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/general-entity/14.0.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-topology/2013.10.21.26.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-type-util/14.0.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/opendaylight-l2-types/2013.08.27.26.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/yang-ext/2013.09.07.26.18 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/databind/9.0.1 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-dom-api/9.0.1 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-api/9.0.1 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 
4.4.8 | mvn:org.opendaylight.netconf/keystore-none/9.0.1 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/draft-ietf-restconf-server/9.0.1 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/rfc5277/9.0.1 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/sal-remote/9.0.1 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-api/9.0.1 2025-10-14T00:47:13,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-common-mdsal/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/odl-device-notification/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-api/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-mdsal-spi/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-nb/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-api/9.0.1 
2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-jaxrs/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-mdsal/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-spi/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-subscription/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/sal-remote-impl/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/shaded-sshd/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-api/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-http/9.0.1 2025-10-14T00:47:13,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-ssh/9.0.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tcp/9.0.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tls/9.0.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 
18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-api/9.0.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-none/9.0.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/yanglib-mdsal-writer/9.0.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/bundles-diag/14.1.3 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-api/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-api/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-impl/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/bulk-o-matic/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/device-ownership-service/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.openflowplugin.applications/forwardingrules-manager/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/lldp-speaker/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/of-switch-config-pusher/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/reconciliation-framework/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-lldp-discovery/0.20.1 2025-10-14T00:47:13,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-manager/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-blueprint-config/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-common/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-api/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-onf/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.openflowplugin/openflowplugin-impl/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.libraries/liblldp/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-base/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-service/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-statistics/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-inventory/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-topology/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-blueprint-config/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-api/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-impl/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-spi/0.20.1 2025-10-14T00:47:13,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-util/0.20.1 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-api/0.20.1 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-impl/0.20.1 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-shell/0.20.1 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-api/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-dynamic/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-osgi/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-spi/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-generator/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-loader/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-model/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-reflect/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-api/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-osgi/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-spi/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-spec/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/codegen-extensions/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/concepts/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-model-api/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-parser-support/14.0.17 2025-10-14T00:47:13,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-model-api/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-parser-support/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-model-api/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-parser-support/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-model-api/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-parser-support/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-model-api/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-parser-support/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-model-api/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-parser-support/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-model-api/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-parser-support/14.0.17 2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-model-api/14.0.17
2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-parser-support/14.0.17
2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-model-api/14.0.17
2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-parser-support/14.0.17
2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-model-api/14.0.17
2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-parser-support/14.0.17
2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/util/14.0.17
2025-10-14T00:47:13,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common-netty/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-api/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-binfmt/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-gson/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-xml/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-impl/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-spi/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-transform/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-api/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-ri/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-spi/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-util/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-ir/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-api/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-export/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-ri/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-spi/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-util/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-api/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-impl/14.0.17
2025-10-14T00:47:13,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-reactor/14.0.17
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-rfc7950/14.0.17
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-spi/14.0.17
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-api/14.0.17
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-fs/14.0.17
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-spi/14.0.17
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-api/14.0.17
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-impl/14.0.17
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-war/2.6.17/jar/uber
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-api/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-el2/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-servlet31/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-war/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-whiteboard/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jetty/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jsp/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-runtime/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-spi/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-tomcat-common/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-websocket/8.0.33
2025-10-14T00:47:13,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.component/1.5.1
2025-10-14T00:47:13,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.owasp.encoder/encoder/1.3.1
2025-10-14T00:47:13,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.scala-lang/scala-library/2.13.16
2025-10-14T00:47:13,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.reactivestreams/reactive-streams/1.0.4
2025-10-14T00:47:13,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.codehaus.woodstox/stax2-api/4.2.2
2025-10-14T00:47:13,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:tech.pantheon.triemap/triemap/1.3.2
2025-10-14T00:47:13,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:net.java.dev.stax-utils/stax-utils/20070216
2025-10-14T00:47:13,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:org.lmdbjava/lmdbjava/0.7.0
2025-10-14T00:47:13,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Stopping bundles:
2025-10-14T00:47:13,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.pool.common/1.5.7
2025-10-14T00:47:13,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3
2025-10-14T00:47:13,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.jasypt/1.9.3.1
2025-10-14T00:47:13,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.transaction-api/1.2.0
2025-10-14T00:47:13,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.enterprise.cdi-api/2.0.0.SP1
2025-10-14T00:47:13,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3
2025-10-14T00:47:13,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.config/1.5.7
2025-10-14T00:47:13,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Uninstalling bundles:
2025-10-14T00:47:13,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3
2025-10-14T00:47:13,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Installing bundles:
2025-10-14T00:47:13,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.checkerframework/checker-qual/3.50.0
2025-10-14T00:47:13,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.code.gson/gson/2.13.1
2025-10-14T00:47:13,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/guava/33.4.8-jre
2025-10-14T00:47:13,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/failureaccess/1.0.3
2025-10-14T00:47:13,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.googlecode.json-simple/json-simple/1.1.1
2025-10-14T00:47:13,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.h2database/h2/2.3.232
2025-10-14T00:47:13,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.rabbitmq/amqp-client/5.26.0
2025-10-14T00:47:13,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/config/1.4.3
2025-10-14T00:47:13,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/ssl-config-core_2.13/0.6.1
2025-10-14T00:47:13,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-client/1.38.1
2025-10-14T00:47:13,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-driver/1.38.1
2025-10-14T00:47:13,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-core/4.2.36
2025-10-14T00:47:13,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-graphite/4.2.36
2025-10-14T00:47:13,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-healthchecks/4.2.36
2025-10-14T00:47:13,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jmx/4.2.36
2025-10-14T00:47:13,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jvm/4.2.36
2025-10-14T00:47:13,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-buffer/4.2.6.Final
2025-10-14T00:47:13,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-base/4.2.6.Final
2025-10-14T00:47:13,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-compression/4.2.6.Final
2025-10-14T00:47:13,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http/4.2.6.Final
2025-10-14T00:47:13,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http2/4.2.6.Final
2025-10-14T00:47:13,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-common/4.2.6.Final
2025-10-14T00:47:13,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-handler/4.2.6.Final
2025-10-14T00:47:13,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-resolver/4.2.6.Final
2025-10-14T00:47:13,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport/4.2.6.Final
2025-10-14T00:47:13,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-classes-epoll/4.2.6.Final
2025-10-14T00:47:13,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-epoll/4.2.6.Final/jar/linux-x86_64
2025-10-14T00:47:13,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-unix-common/4.2.6.Final
2025-10-14T00:47:13,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.activation/jakarta.activation-api/1.2.2
2025-10-14T00:47:13,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.annotation/jakarta.annotation-api/1.3.5
2025-10-14T00:47:13,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.servlet/jakarta.servlet-api/4.0.4
2025-10-14T00:47:13,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.validation/jakarta.validation-api/2.0.2
2025-10-14T00:47:13,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.ws.rs/jakarta.ws.rs-api/2.1.6
2025-10-14T00:47:13,754 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.javassist/javassist/3.30.2-GA
2025-10-14T00:47:13,755 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.servlet/javax.servlet-api/3.1.0
2025-10-14T00:47:13,756 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.websocket/jakarta.websocket-api/1.1.2
2025-10-14T00:47:13,757 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/karaf.branding/14.1.3
2025-10-14T00:47:13,757 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.lz4/lz4-java/1.8.0
2025-10-14T00:47:13,759 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:net.bytebuddy/byte-buddy/1.17.7
2025-10-14T00:47:13,772 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.agrona/agrona/1.15.2
2025-10-14T00:47:13,773 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.antlr/antlr4-runtime/4.13.2
2025-10-14T00:47:13,775 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.api/1.0.1
2025-10-14T00:47:13,775 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.cm/1.3.2
2025-10-14T00:47:13,776 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.core/1.10.3
2025-10-14T00:47:13,778 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.api/1.1.5
2025-10-14T00:47:13,779 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.api/1.2.0
2025-10-14T00:47:13,780 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.core/1.2.0
2025-10-14T00:47:13,781 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.core/1.1.8
2025-10-14T00:47:13,782 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.whiteboard/1.2.0
2025-10-14T00:47:13,782 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.proxy/org.apache.aries.proxy/1.1.14
2025-10-14T00:47:13,784 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.quiesce/org.apache.aries.quiesce.api/1.0.0
2025-10-14T00:47:13,784 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries/org.apache.aries.util/1.1.3
2025-10-14T00:47:13,806 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-collections/commons-collections/3.2.2
2025-10-14T00:47:13,808 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-beanutils/commons-beanutils/1.11.0
2025-10-14T00:47:13,810 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-codec/commons-codec/1.19.0
2025-10-14T00:47:13,812 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-lang3/3.18.0
2025-10-14T00:47:13,815 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-text/1.14.0
2025-10-14T00:47:13,816 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.felix/org.apache.felix.scr/2.2.6
2025-10-14T00:47:13,818 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.2
2025-10-14T00:47:13,819 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.blueprintstate/4.4.8
2025-10-14T00:47:13,820 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.core/4.4.8
2025-10-14T00:47:13,821 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.config/org.apache.karaf.config.command/4.4.8
2025-10-14T00:47:13,822 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.blueprint/4.4.8
2025-10-14T00:47:13,823 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.features/4.4.8
2025-10-14T00:47:13,823 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.kar/4.4.8
2025-10-14T00:47:13,825 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.wrap/4.4.8
2025-10-14T00:47:13,825 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.boot/4.4.8
2025-10-14T00:47:13,826 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.core/4.4.8
2025-10-14T00:47:13,827 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.features/org.apache.karaf.features.command/4.4.8
2025-10-14T00:47:13,828 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.http/org.apache.karaf.http.core/4.4.8
2025-10-14T00:47:13,830 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.instance/org.apache.karaf.instance.core/4.4.8
2025-10-14T00:47:13,832 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.command/4.4.8
2025-10-14T00:47:13,833 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.config/4.4.8
2025-10-14T00:47:13,833 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.modules/4.4.8
2025-10-14T00:47:13,836 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jdbc/org.apache.karaf.jdbc.core/4.4.8
2025-10-14T00:47:13,837 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.kar/org.apache.karaf.kar.core/4.4.8
2025-10-14T00:47:13,838 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.log/org.apache.karaf.log.core/4.4.8
2025-10-14T00:47:13,839 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.management/org.apache.karaf.management.server/4.4.8
2025-10-14T00:47:13,840 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.package/org.apache.karaf.package.core/4.4.8
2025-10-14T00:47:13,841 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.management/4.4.8
2025-10-14T00:47:13,842 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.state/4.4.8
2025-10-14T00:47:13,842 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.service/org.apache.karaf.service.core/4.4.8
2025-10-14T00:47:13,843 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.commands/4.4.8
2025-10-14T00:47:13,845 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.console/4.4.8
2025-10-14T00:47:13,846 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.core/4.4.8
2025-10-14T00:47:13,848 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.ssh/4.4.8
2025-10-14T00:47:13,849 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.table/4.4.8
2025-10-14T00:47:13,850 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.system/org.apache.karaf.system.core/4.4.8
2025-10-14T00:47:13,851 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.web/org.apache.karaf.web.core/4.4.8
2025-10-14T00:47:13,852 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-osgi/2.15.0
2025-10-14T00:47:13,856 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-scp/2.15.0
2025-10-14T00:47:13,857 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-sftp/2.15.0
2025-10-14T00:47:13,859 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jdt/ecj/3.26.0
2025-10-14T00:47:13,864 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-client/9.4.57.v20241219
2025-10-14T00:47:13,866 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-continuation/9.4.57.v20241219
2025-10-14T00:47:13,866 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-http/9.4.57.v20241219
2025-10-14T00:47:13,867 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-io/9.4.57.v20241219
2025-10-14T00:47:13,868 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jaas/9.4.57.v20241219
2025-10-14T00:47:13,869 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jmx/9.4.57.v20241219
2025-10-14T00:47:13,870 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-security/9.4.57.v20241219
2025-10-14T00:47:13,871 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-server/9.4.57.v20241219
2025-10-14T00:47:13,872 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlet/9.4.57.v20241219
2025-10-14T00:47:13,873 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlets/9.4.57.v20241219
2025-10-14T00:47:13,874 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util/9.4.57.v20241219
2025-10-14T00:47:13,876 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util-ajax/9.4.57.v20241219
2025-10-14T00:47:13,877 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-xml/9.4.57.v20241219
2025-10-14T00:47:13,877 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-api/2.6.1
2025-10-14T00:47:13,878 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2.external/aopalliance-repackaged/2.6.1
2025-10-14T00:47:13,879 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-locator/2.6.1
2025-10-14T00:47:13,880 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/osgi-resource-locator/1.0.3
2025-10-14T00:47:13,881 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-utils/2.6.1
2025-10-14T00:47:13,881 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet/2.47
2025-10-14T00:47:13,882 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet-core/2.47
2025-10-14T00:47:13,883 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-client/2.47
2025-10-14T00:47:13,885 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-common/2.47
2025-10-14T00:47:13,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-server/2.47
2025-10-14T00:47:13,890 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.inject/jersey-hk2/2.47
2025-10-14T00:47:13,890 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.media/jersey-media-sse/2.47
2025-10-14T00:47:13,891 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jline/jline/3.21.0
2025-10-14T00:47:13,893 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jolokia/jolokia-osgi/1.7.2
2025-10-14T00:47:13,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jspecify/jspecify/1.0.0
2025-10-14T00:47:13,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm/9.8
2025-10-14T00:47:13,896 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-commons/9.8
2025-10-14T00:47:13,897 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-tree/9.8
2025-10-14T00:47:13,897 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-analysis/9.8
2025-10-14T00:47:13,898 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-util/9.8
2025-10-14T00:47:13,899 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-authn-api/0.21.2
2025-10-14T00:47:13,899 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-cert/0.21.2
2025-10-14T00:47:13,901 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-encrypt-service/0.21.2
2025-10-14T00:47:13,901 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-encrypt-service-impl/0.21.2
2025-10-14T00:47:13,902 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-filterchain/0.21.2
2025-10-14T00:47:13,903 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-idm-store-h2/0.21.2
2025-10-14T00:47:13,904 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-jetty-auth-log-filter/0.21.2
2025-10-14T00:47:13,904 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-api/0.21.2
2025-10-14T00:47:13,905 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-impl/0.21.2
2025-10-14T00:47:13,906 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/repackaged-shiro/0.21.2
2025-10-14T00:47:13,908 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro/0.21.2
2025-10-14T00:47:13,910 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro-api/0.21.2
2025-10-14T00:47:13,911 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-tokenauthrealm/0.21.2
2025-10-14T00:47:13,912 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-api/0.21.2
2025-10-14T00:47:13,912 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-osgi-impl/0.21.2
2025-10-14T00:47:13,913 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-api/0.21.2
2025-10-14T00:47:13,914 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-jersey2/0.21.2
2025-10-14T00:47:13,914 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/atomix-storage/11.0.2
2025-10-14T00:47:13,915 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/blueprint/11.0.2
2025-10-14T00:47:13,916 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-api/11.0.2
2025-10-14T00:47:13,917 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-client/11.0.2
2025-10-14T00:47:13,918 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-dom-api/11.0.2
2025-10-14T00:47:13,919 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-mgmt-api/11.0.2
2025-10-14T00:47:13,919 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/eos-dom-akka/11.0.2
2025-10-14T00:47:13,921 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-api/11.0.2
2025-10-14T00:47:13,922 |
INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-journal/11.0.2 2025-10-14T00:47:13,922 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-spi/11.0.2 2025-10-14T00:47:13,923 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/repackaged-pekko/11.0.2 2025-10-14T00:47:13,947 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-akka-raft/11.0.2 2025-10-14T00:47:13,949 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-akka-segmented-journal/11.0.2 2025-10-14T00:47:13,950 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-api/11.0.2 2025-10-14T00:47:13,951 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-impl/11.0.2 2025-10-14T00:47:13,952 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-clustering-commons/11.0.2 2025-10-14T00:47:13,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-common-util/11.0.2 2025-10-14T00:47:13,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-distributed-datastore/11.0.2 2025-10-14T00:47:13,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-remoterpc-connector/11.0.2 2025-10-14T00:47:13,958 | 
INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-api/7.1.7 2025-10-14T00:47:13,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-impl/7.1.7 2025-10-14T00:47:13,960 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-shell/7.1.7 2025-10-14T00:47:13,960 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-api/7.1.7 2025-10-14T00:47:13,961 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-impl/7.1.7 2025-10-14T00:47:13,962 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/infrautils-util/7.1.7 2025-10-14T00:47:13,962 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-dom-adapter/14.0.18 2025-10-14T00:47:13,964 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-util/14.0.18 2025-10-14T00:47:13,965 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-crypt-hash/14.0.18 2025-10-14T00:47:13,965 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-encryption-algs/14.0.18 2025-10-14T00:47:13,966 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-key-exchange-algs/14.0.18 
2025-10-14T00:47:13,967 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-mac-algs/14.0.18 2025-10-14T00:47:13,967 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-public-key-algs/14.0.18 2025-10-14T00:47:13,968 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-tls-cipher-suite-algs/14.0.18 2025-10-14T00:47:13,969 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6241/14.0.18 2025-10-14T00:47:13,971 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6243/14.0.18 2025-10-14T00:47:13,972 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6470/14.0.18 2025-10-14T00:47:13,974 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-inet-types/14.0.18 2025-10-14T00:47:13,974 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-yang-types/14.0.18 2025-10-14T00:47:13,975 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7407-ietf-x509-cert-to-name/14.0.18 2025-10-14T00:47:13,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7952/14.0.18 2025-10-14T00:47:13,977 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf/14.0.18 2025-10-14T00:47:13,977 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf-monitoring/14.0.18 2025-10-14T00:47:13,978 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8072/14.0.18 2025-10-14T00:47:13,979 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8341/14.0.18 2025-10-14T00:47:13,980 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-datastores/14.0.18 2025-10-14T00:47:13,981 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-origin/14.0.18 2025-10-14T00:47:13,982 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8343/14.0.18 2025-10-14T00:47:13,983 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8344/14.0.18 2025-10-14T00:47:13,984 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8525/14.0.18 2025-10-14T00:47:13,985 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8526/14.0.18 2025-10-14T00:47:13,986 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 
4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8528/14.0.18 2025-10-14T00:47:13,987 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8529/14.0.18 2025-10-14T00:47:13,988 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8639/14.0.18 2025-10-14T00:47:13,990 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8650/14.0.18 2025-10-14T00:47:13,991 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9640/14.0.18 2025-10-14T00:47:13,992 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9641/14.0.18 2025-10-14T00:47:13,993 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9642/14.0.18 2025-10-14T00:47:13,995 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-client/14.0.18 2025-10-14T00:47:13,996 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-common/14.0.18 2025-10-14T00:47:13,997 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-server/14.0.18 2025-10-14T00:47:13,998 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-client/14.0.18 
2025-10-14T00:47:13,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-common/14.0.18 2025-10-14T00:47:14,000 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-server/14.0.18 2025-10-14T00:47:14,001 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-client/14.0.18 2025-10-14T00:47:14,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-common/14.0.18 2025-10-14T00:47:14,004 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-server/14.0.18 2025-10-14T00:47:14,005 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-adapter/14.0.18 2025-10-14T00:47:14,006 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-api/14.0.18 2025-10-14T00:47:14,007 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-spi/14.0.18 2025-10-14T00:47:14,007 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-common-api/14.0.18 2025-10-14T00:47:14,008 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-api/14.0.18 2025-10-14T00:47:14,009 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-broker/14.0.18 2025-10-14T00:47:14,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-schema-osgi/14.0.18 2025-10-14T00:47:14,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-spi/14.0.18 2025-10-14T00:47:14,011 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-api/14.0.18 2025-10-14T00:47:14,012 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-common-api/14.0.18 2025-10-14T00:47:14,013 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-dom-api/14.0.18 2025-10-14T00:47:14,013 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-api/14.0.18 2025-10-14T00:47:14,014 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-impl/14.0.18 2025-10-14T00:47:14,015 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/general-entity/14.0.18 2025-10-14T00:47:14,015 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-topology/2013.10.21.26.18 2025-10-14T00:47:14,016 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-type-util/14.0.18 2025-10-14T00:47:14,017 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 
4.4.8 | mvn:org.opendaylight.mdsal.model/opendaylight-l2-types/2013.08.27.26.18 2025-10-14T00:47:14,018 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/yang-ext/2013.09.07.26.18 2025-10-14T00:47:14,018 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/databind/9.0.1 2025-10-14T00:47:14,019 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-dom-api/9.0.1 2025-10-14T00:47:14,020 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-api/9.0.1 2025-10-14T00:47:14,020 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-none/9.0.1 2025-10-14T00:47:14,021 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/draft-ietf-restconf-server/9.0.1 2025-10-14T00:47:14,023 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/rfc5277/9.0.1 2025-10-14T00:47:14,024 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/sal-remote/9.0.1 2025-10-14T00:47:14,025 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-api/9.0.1 2025-10-14T00:47:14,026 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-common-mdsal/9.0.1 2025-10-14T00:47:14,026 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.netconf/odl-device-notification/9.0.1 2025-10-14T00:47:14,027 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-api/9.0.1 2025-10-14T00:47:14,028 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-mdsal-spi/9.0.1 2025-10-14T00:47:14,029 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-nb/9.0.1 2025-10-14T00:47:14,030 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server/9.0.1 2025-10-14T00:47:14,031 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-api/9.0.1 2025-10-14T00:47:14,032 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-jaxrs/9.0.1 2025-10-14T00:47:14,032 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-mdsal/9.0.1 2025-10-14T00:47:14,033 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-spi/9.0.1 2025-10-14T00:47:14,035 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-subscription/9.0.1 2025-10-14T00:47:14,035 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/sal-remote-impl/9.0.1 2025-10-14T00:47:14,036 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/shaded-sshd/9.0.1 
2025-10-14T00:47:14,041 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-api/9.0.1 2025-10-14T00:47:14,042 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-http/9.0.1 2025-10-14T00:47:14,045 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-ssh/9.0.1 2025-10-14T00:47:14,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tcp/9.0.1 2025-10-14T00:47:14,047 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tls/9.0.1 2025-10-14T00:47:14,048 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-api/9.0.1 2025-10-14T00:47:14,048 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-none/9.0.1 2025-10-14T00:47:14,049 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/yanglib-mdsal-writer/9.0.1 2025-10-14T00:47:14,050 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/bundles-diag/14.1.3 2025-10-14T00:47:14,050 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin/0.20.1 2025-10-14T00:47:14,054 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-api/0.20.1 2025-10-14T00:47:14,056 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-api/0.20.1 2025-10-14T00:47:14,057 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-impl/0.20.1 2025-10-14T00:47:14,057 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/bulk-o-matic/0.20.1 2025-10-14T00:47:14,059 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/device-ownership-service/0.20.1 2025-10-14T00:47:14,059 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/forwardingrules-manager/0.20.1 2025-10-14T00:47:14,061 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/lldp-speaker/0.20.1 2025-10-14T00:47:14,062 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/of-switch-config-pusher/0.20.1 2025-10-14T00:47:14,062 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/reconciliation-framework/0.20.1 2025-10-14T00:47:14,063 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-lldp-discovery/0.20.1 2025-10-14T00:47:14,064 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-manager/0.20.1 2025-10-14T00:47:14,065 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-blueprint-config/0.20.1 2025-10-14T00:47:14,065 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-common/0.20.1 2025-10-14T00:47:14,066 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-api/0.20.1 2025-10-14T00:47:14,069 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-onf/0.20.1 2025-10-14T00:47:14,070 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-impl/0.20.1 2025-10-14T00:47:14,075 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.libraries/liblldp/0.20.1 2025-10-14T00:47:14,076 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-base/0.20.1 2025-10-14T00:47:14,102 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-service/0.20.1 2025-10-14T00:47:14,107 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-statistics/0.20.1 2025-10-14T00:47:14,110 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-inventory/0.20.1 2025-10-14T00:47:14,111 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.openflowplugin.model/model-topology/0.20.1 2025-10-14T00:47:14,112 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-blueprint-config/0.20.1 2025-10-14T00:47:14,113 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-api/0.20.1 2025-10-14T00:47:14,122 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-impl/0.20.1 2025-10-14T00:47:14,125 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-spi/0.20.1 2025-10-14T00:47:14,126 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-util/0.20.1 2025-10-14T00:47:14,126 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-api/0.20.1 2025-10-14T00:47:14,127 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-impl/0.20.1 2025-10-14T00:47:14,128 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-shell/0.20.1 2025-10-14T00:47:14,129 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-api/14.0.17 2025-10-14T00:47:14,129 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-dynamic/14.0.17 2025-10-14T00:47:14,130 | 
INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-osgi/14.0.17 2025-10-14T00:47:14,131 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-spi/14.0.17 2025-10-14T00:47:14,132 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-generator/14.0.17 2025-10-14T00:47:14,133 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-loader/14.0.17 2025-10-14T00:47:14,133 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-model/14.0.17 2025-10-14T00:47:14,134 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-reflect/14.0.17 2025-10-14T00:47:14,135 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-api/14.0.17 2025-10-14T00:47:14,136 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-osgi/14.0.17 2025-10-14T00:47:14,136 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-spi/14.0.17 2025-10-14T00:47:14,137 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-spec/14.0.17 2025-10-14T00:47:14,138 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/codegen-extensions/14.0.17 2025-10-14T00:47:14,138 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/concepts/14.0.17
2025-10-14T00:47:14,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-model-api/14.0.17
2025-10-14T00:47:14,140 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-parser-support/14.0.17
2025-10-14T00:47:14,140 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-model-api/14.0.17
2025-10-14T00:47:14,141 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-parser-support/14.0.17
2025-10-14T00:47:14,141 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-model-api/14.0.17
2025-10-14T00:47:14,142 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-parser-support/14.0.17
2025-10-14T00:47:14,142 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-model-api/14.0.17
2025-10-14T00:47:14,143 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-parser-support/14.0.17
2025-10-14T00:47:14,144 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-model-api/14.0.17
2025-10-14T00:47:14,144 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-parser-support/14.0.17
2025-10-14T00:47:14,145 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-model-api/14.0.17
2025-10-14T00:47:14,145 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-parser-support/14.0.17
2025-10-14T00:47:14,146 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-model-api/14.0.17
2025-10-14T00:47:14,147 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-parser-support/14.0.17
2025-10-14T00:47:14,147 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-model-api/14.0.17
2025-10-14T00:47:14,148 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-parser-support/14.0.17
2025-10-14T00:47:14,148 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-model-api/14.0.17
2025-10-14T00:47:14,149 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-parser-support/14.0.17
2025-10-14T00:47:14,150 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-model-api/14.0.17
2025-10-14T00:47:14,150 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-parser-support/14.0.17
2025-10-14T00:47:14,151 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/util/14.0.17
2025-10-14T00:47:14,151 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common/14.0.17
2025-10-14T00:47:14,152 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common-netty/14.0.17
2025-10-14T00:47:14,153 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-api/14.0.17
2025-10-14T00:47:14,154 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-binfmt/14.0.17
2025-10-14T00:47:14,155 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-gson/14.0.17
2025-10-14T00:47:14,156 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-xml/14.0.17
2025-10-14T00:47:14,156 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-impl/14.0.17
2025-10-14T00:47:14,157 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-spi/14.0.17
2025-10-14T00:47:14,158 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-transform/14.0.17
2025-10-14T00:47:14,158 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-api/14.0.17
2025-10-14T00:47:14,159 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-ri/14.0.17
2025-10-14T00:47:14,160 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-spi/14.0.17
2025-10-14T00:47:14,161 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-util/14.0.17
2025-10-14T00:47:14,161 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-ir/14.0.17
2025-10-14T00:47:14,162 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-api/14.0.17
2025-10-14T00:47:14,163 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-export/14.0.17
2025-10-14T00:47:14,164 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-ri/14.0.17
2025-10-14T00:47:14,165 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-spi/14.0.17
2025-10-14T00:47:14,166 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-util/14.0.17
2025-10-14T00:47:14,166 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-api/14.0.17
2025-10-14T00:47:14,167 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-impl/14.0.17
2025-10-14T00:47:14,168 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-reactor/14.0.17
2025-10-14T00:47:14,169 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-rfc7950/14.0.17
2025-10-14T00:47:14,170 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-spi/14.0.17
2025-10-14T00:47:14,171 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-api/14.0.17
2025-10-14T00:47:14,172 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-fs/14.0.17
2025-10-14T00:47:14,172 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-spi/14.0.17
2025-10-14T00:47:14,173 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-api/14.0.17
2025-10-14T00:47:14,174 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-impl/14.0.17
2025-10-14T00:47:14,175 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-war/2.6.17/jar/uber
2025-10-14T00:47:14,177 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-api/8.0.33
2025-10-14T00:47:14,178 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-el2/8.0.33
2025-10-14T00:47:14,179 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-servlet31/8.0.33
2025-10-14T00:47:14,179 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-war/8.0.33
2025-10-14T00:47:14,180 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-whiteboard/8.0.33
2025-10-14T00:47:14,181 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jetty/8.0.33
2025-10-14T00:47:14,182 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jsp/8.0.33
2025-10-14T00:47:14,184 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-runtime/8.0.33
2025-10-14T00:47:14,185 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-spi/8.0.33
2025-10-14T00:47:14,187 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-tomcat-common/8.0.33
2025-10-14T00:47:14,188 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-websocket/8.0.33
2025-10-14T00:47:14,189 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.component/1.5.1
2025-10-14T00:47:14,189 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.owasp.encoder/encoder/1.3.1
2025-10-14T00:47:14,190 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.scala-lang/scala-library/2.13.16
2025-10-14T00:47:14,198 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.reactivestreams/reactive-streams/1.0.4
2025-10-14T00:47:14,199 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.codehaus.woodstox/stax2-api/4.2.2
2025-10-14T00:47:14,199 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:tech.pantheon.triemap/triemap/1.3.2
2025-10-14T00:47:14,200 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:net.java.dev.stax-utils/stax-utils/20070216
2025-10-14T00:47:14,201 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:org.lmdbjava/lmdbjava/0.7.0
2025-10-14T00:47:14,216 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-password-service-config.xml
2025-10-14T00:47:14,248 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/configuration/factory/pekko.conf
2025-10-14T00:47:14,337 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/org.opendaylight.controller.cluster.datastore.cfg
2025-10-14T00:47:14,347 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0//etc/org.jolokia.osgi.cfg
2025-10-14T00:47:14,348 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg
2025-10-14T00:47:14,348 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/default-openflow-connection-config.xml
2025-10-14T00:47:14,349 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/legacy-openflow-connection-config.xml
2025-10-14T00:47:14,349 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-app-config.xml
2025-10-14T00:47:14,349 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-datastore-config.xml
2025-10-14T00:47:14,350 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/bin/idmtool
2025-10-14T00:47:14,350 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0//etc/org.opendaylight.aaa.filterchain.cfg
2025-10-14T00:47:14,350 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-cert-config.xml
2025-10-14T00:47:14,351 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/jetty-web.xml
2025-10-14T00:47:14,353 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/org.opendaylight.restconf.nb.rfc8040.cfg
2025-10-14T00:47:14,355 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Refreshing bundles:
2025-10-14T00:47:14,355 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3 (Attached fragments changed: [org.ops4j.pax.web.pax-web-compatibility-el2/8.0.33])
2025-10-14T00:47:14,355 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.enterprise.cdi-api/2.0.0.SP1 (Wired to javax.el-api/3.0.3 which is being refreshed)
2025-10-14T00:47:14,355 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.transaction-api/1.2.0 (Wired to javax.enterprise.cdi-api/2.0.0.SP1 which is being refreshed)
2025-10-14T00:47:14,355 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.jasypt/1.9.3.1 (Should be wired to: jakarta.servlet-api/4.0.0 (through [org.apache.servicemix.bundles.jasypt/1.9.3.1] osgi.wiring.package; resolution:=optional; filter:="(osgi.wiring.package=javax.servlet)"))
2025-10-14T00:47:14,355 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 (Bundle will be uninstalled)
2025-10-14T00:47:14,355 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.config/1.5.7 (Wired to org.apache.servicemix.bundles.jasypt/1.9.3.1 which is being refreshed)
2025-10-14T00:47:14,355 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.pool.common/1.5.7 (Wired to javax.transaction-api/1.2.0 which is being refreshed)
2025-10-14T00:47:15,044 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Starting bundles:
2025-10-14T00:47:15,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm/9.8.0
2025-10-14T00:47:15,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.tree/9.8.0
2025-10-14T00:47:15,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.tree.analysis/9.8.0
2025-10-14T00:47:15,047 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.util/9.8.0
2025-10-14T00:47:15,047 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.quiesce.api/1.0.0
2025-10-14T00:47:15,048 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.blueprint.api/1.0.1
2025-10-14T00:47:15,048 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.commons/9.8.0
2025-10-14T00:47:15,049 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.proxy/1.1.14
2025-10-14T00:47:15,053 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.blueprint.core/1.10.3
2025-10-14T00:47:15,192 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.core/1.10.3 has been started
2025-10-14T00:47:15,194 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.blueprint.cm/1.3.2
2025-10-14T00:47:15,210 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.cm/1.3.2 has been started
2025-10-14T00:47:15,211 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.blueprint/4.4.8
2025-10-14T00:47:15,215 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.kar/4.4.8
2025-10-14T00:47:15,217 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.restconf.nb.rfc8040} from /tmp/karaf-0.23.0/etc/org.opendaylight.restconf.nb.rfc8040.cfg
2025-10-14T00:47:15,218 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.wrap/4.4.8
2025-10-14T00:47:15,219 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.jolokia.osgi} from /tmp/karaf-0.23.0/etc/org.jolokia.osgi.cfg
2025-10-14T00:47:15,224 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.features/4.4.8
2025-10-14T00:47:15,235 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.controller.cluster.datastore} from /tmp/karaf-0.23.0/etc/org.opendaylight.controller.cluster.datastore.cfg
2025-10-14T00:47:15,236 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.openflowplugin} from /tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg
2025-10-14T00:47:15,237 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.aaa.filterchain} from /tmp/karaf-0.23.0/etc/org.opendaylight.aaa.filterchain.cfg
2025-10-14T00:47:15,241 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.blueprint.api/1.2.0
2025-10-14T00:47:15,242 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.blueprint.core/1.2.0
2025-10-14T00:47:15,244 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jaas.config/4.4.8
2025-10-14T00:47:15,249 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jaas.modules/4.4.8
2025-10-14T00:47:15,254 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.sshd.osgi/2.15.0
2025-10-14T00:47:15,255 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.sshd.scp/2.15.0
2025-10-14T00:47:15,256 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.sshd.sftp/2.15.0
2025-10-14T00:47:15,257 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.jline/3.21.0
2025-10-14T00:47:15,257 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.core/4.4.8
2025-10-14T00:47:15,287 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.deployer.kar/4.4.8
2025-10-14T00:47:15,289 | INFO | features-3-thread-1 | Activator | 120 - org.apache.karaf.shell.core - 4.4.8 | Not starting local console. To activate set karaf.startLocalConsole=true
2025-10-14T00:47:15,313 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.karaf.shell.core/4.4.8 has been started
2025-10-14T00:47:15,317 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.ssh/4.4.8
2025-10-14T00:47:15,344 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.shell.ssh/4.4.8. Missing service: [org.apache.sshd.server.SshServer]
2025-10-14T00:47:15,344 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jdt.core.compiler.batch/3.26.0.v20210609-0549
2025-10-14T00:47:15,355 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.service.core/4.4.8
2025-10-14T00:47:15,366 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.service.core/4.4.8
2025-10-14T00:47:15,366 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.util/9.4.57.v20241219
2025-10-14T00:47:15,367 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.continuation/9.4.57.v20241219
2025-10-14T00:47:15,367 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.diagnostic.core/4.4.8
2025-10-14T00:47:15,372 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.diagnostic.core/4.4.8
2025-10-14T00:47:15,372 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.bundle.core/4.4.8
2025-10-14T00:47:15,389 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.bundle.core/4.4.8
2025-10-14T00:47:15,390 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.osgi.service.component/1.5.1.202212101352
2025-10-14T00:47:15,392 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.felix.scr/2.2.6
2025-10-14T00:47:15,397 | INFO | features-3-thread-1 | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Starting with globalExtender setting: false
2025-10-14T00:47:15,400 | INFO | features-3-thread-1 | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Version = 2.2.6
2025-10-14T00:47:15,412 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.scr.state/4.4.8
2025-10-14T00:47:15,447 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.jmx/9.4.57.v20241219
2025-10-14T00:47:15,449 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.io/9.4.57.v20241219
2025-10-14T00:47:15,449 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.http/9.4.57.v20241219
2025-10-14T00:47:15,450 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.server/9.4.57.v20241219
2025-10-14T00:47:15,450 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.management.server/4.4.8
2025-10-14T00:47:15,457 | INFO | activator-1-thread-1 | Activator | 113 - org.apache.karaf.management.server - 4.4.8 | Setting java.rmi.server.hostname system property to 127.0.0.1
2025-10-14T00:47:15,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.scr.management/4.4.8
2025-10-14T00:47:15,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jaas.command/4.4.8
2025-10-14T00:47:15,496 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jaas.command/4.4.8
2025-10-14T00:47:15,497 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8
2025-10-14T00:47:15,498 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8
2025-10-14T00:47:15,498 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.url.war/2.6.17
2025-10-14T00:47:15,506 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.servlets/9.4.57.v20241219
2025-10-14T00:47:15,506 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.api/1.1.5
2025-10-14T00:47:15,506 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.core/1.1.8
2025-10-14T00:47:15,508 | INFO | features-3-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Starting JMX OSGi agent
2025-10-14T00:47:15,516 | INFO | features-3-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=7fd23fde-08de-4a08-9a3a-cce5055d434a] for service with service.id [15]
2025-10-14T00:47:15,517 | INFO | features-3-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=7fd23fde-08de-4a08-9a3a-cce5055d434a] for service with service.id [39]
2025-10-14T00:47:15,519 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.client/9.4.57.v20241219
2025-10-14T00:47:15,519 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.features.command/4.4.8
2025-10-14T00:47:15,531 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.features.command/4.4.8
2025-10-14T00:47:15,532 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.util.ajax/9.4.57.v20241219
2025-10-14T00:47:15,533 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.xml/9.4.57.v20241219
2025-10-14T00:47:15,533 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.servlet-api/4.0.0
2025-10-14T00:47:15,533 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.websocket-api/1.1.2
2025-10-14T00:47:15,533 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-api/8.0.33
2025-10-14T00:47:15,534 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-spi/8.0.33
2025-10-14T00:47:15,534 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.config.command/4.4.8
2025-10-14T00:47:15,546 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.config.command/4.4.8
2025-10-14T00:47:15,648 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.ssh/4.4.8
2025-10-14T00:47:15,659 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 115 - org.apache.karaf.scr.management - 4.4.8 | Activating the Apache Karaf ServiceComponentRuntime MBean
2025-10-14T00:47:15,660 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61c51734 with name osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=7fd23fde-08de-4a08-9a3a-cce5055d434a
2025-10-14T00:47:15,661 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61c51734 with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=7fd23fde-08de-4a08-9a3a-cce5055d434a
2025-10-14T00:47:15,661 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61c51734 with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=7fd23fde-08de-4a08-9a3a-cce5055d434a
2025-10-14T00:47:15,663 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61c51734 with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=7fd23fde-08de-4a08-9a3a-cce5055d434a
2025-10-14T00:47:15,663 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61c51734 with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=7fd23fde-08de-4a08-9a3a-cce5055d434a
2025-10-14T00:47:15,663 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61c51734 with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=7fd23fde-08de-4a08-9a3a-cce5055d434a
2025-10-14T00:47:15,664 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61c51734 with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=7fd23fde-08de-4a08-9a3a-cce5055d434a
2025-10-14T00:47:15,665 | INFO | activator-1-thread-1 | DefaultIoServiceFactoryFactory | 125 - org.apache.sshd.osgi - 2.15.0 | No detected/configured IoServiceFactoryFactory; using Nio2ServiceFactoryFactory
2025-10-14T00:47:15,675 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.security/9.4.57.v20241219
2025-10-14T00:47:15,771 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.jaas/9.4.57.v20241219
2025-10-14T00:47:15,773 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3
2025-10-14T00:47:15,773 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-jsp/8.0.33
2025-10-14T00:47:15,774 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-tomcat-common/8.0.33
2025-10-14T00:47:15,774 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.servlet/9.4.57.v20241219
2025-10-14T00:47:15,775 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-jetty/8.0.33
2025-10-14T00:47:15,783 | INFO | features-3-thread-1 | log | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Logging initialized @20781ms to org.eclipse.jetty.util.log.Slf4jLog
2025-10-14T00:47:15,790 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-runtime/8.0.33
2025-10-14T00:47:15,804 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.web]) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because configuration has changed
2025-10-14T00:47:15,804 | INFO | features-3-thread-1 | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | EventAdmin support enabled, WAB events will be posted to EventAdmin topics.
2025-10-14T00:47:15,804 | INFO | features-3-thread-1 | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Pax Web Runtime started
2025-10-14T00:47:15,804 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.http.core/4.4.8
2025-10-14T00:47:15,806 | INFO | paxweb-config-1-thread-1 (change config) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because ServerControllerFactory has been registered
2025-10-14T00:47:15,819 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.http.core/4.4.8. Missing service: [org.apache.karaf.http.core.ProxyService]
2025-10-14T00:47:15,819 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.whiteboard/1.2.0
2025-10-14T00:47:15,834 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.kar.core/4.4.8
2025-10-14T00:47:15,845 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Configuring server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController
2025-10-14T00:47:15,845 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Configuring JettyServerController{configuration=d994d8f4-380d-4355-ba90-4e70899f55cc,state=UNCONFIGURED}
2025-10-14T00:47:15,845 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating Jetty server instance using configuration properties.
2025-10-14T00:47:15,846 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.kar.core/4.4.8
2025-10-14T00:47:15,846 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.bundle.blueprintstate/4.4.8
2025-10-14T00:47:15,857 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-extender-whiteboard/8.0.33
2025-10-14T00:47:15,857 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Processing Jetty configuration from files: [etc/jetty.xml]
2025-10-14T00:47:15,858 | INFO | features-3-thread-1 | Activator | 393 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.33 | Starting Pax Web Whiteboard Extender
2025-10-14T00:47:15,878 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-extender-war/8.0.33
2025-10-14T00:47:15,879 | INFO | features-3-thread-1 | Activator | 392 - org.ops4j.pax.web.pax-web-extender-war - 8.0.33 | Configuring WAR extender thread pool.
Pool size = 3 2025-10-14T00:47:15,935 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Found configured connector "jetty-default": 0.0.0.0:8181 2025-10-14T00:47:15,935 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Using configured jetty-default@1d84ea63{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} as non secure connector for address: 0.0.0.0:8181 2025-10-14T00:47:15,936 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Eagerly starting Jetty thread pool QueuedThreadPool[qtp1873851465]@6fb0b449{STOPPED,0<=0<=200,i=0,r=-1,q=0}[NO_TRY] 2025-10-14T00:47:15,938 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.instance.core/4.4.8 2025-10-14T00:47:15,947 | INFO | paxweb-config-1-thread-1 (change controller) | JettyFactory | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding JMX support to Jetty server 2025-10-14T00:47:15,971 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Starting server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2025-10-14T00:47:15,971 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting JettyServerController{configuration=d994d8f4-380d-4355-ba90-4e70899f55cc,state=STOPPED} 2025-10-14T00:47:15,971 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Server@2834d525{STOPPED}[9.4.57.v20241219] 2025-10-14T00:47:15,972 | INFO | paxweb-config-1-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | jetty-9.4.57.v20241219; built: 2025-01-08T21:24:30.412Z; git: 
df524e6b29271c2e09ba9aea83c18dc9db464a31; jvm 21.0.8+9-Ubuntu-0ubuntu122.04.1 2025-10-14T00:47:15,975 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.instance.core/4.4.8 2025-10-14T00:47:15,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.package.core/4.4.8 2025-10-14T00:47:15,983 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.package.core/4.4.8 2025-10-14T00:47:15,983 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.log.core/4.4.8 2025-10-14T00:47:15,990 | INFO | paxweb-config-1-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | DefaultSessionIdManager workerName=node0 2025-10-14T00:47:15,990 | INFO | paxweb-config-1-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | No SessionScavenger set, using defaults 2025-10-14T00:47:15,991 | INFO | paxweb-config-1-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | node0 Scavenging every 600000ms 2025-10-14T00:47:15,992 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.log.core/4.4.8. 
Missing service: [org.apache.karaf.log.core.LogService] 2025-10-14T00:47:15,993 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-websocket/8.0.33 2025-10-14T00:47:15,996 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.system.core/4.4.8 2025-10-14T00:47:16,000 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.log.core/4.4.8 2025-10-14T00:47:16,007 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.system.core/4.4.8 2025-10-14T00:47:16,008 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.web.core/4.4.8 2025-10-14T00:47:16,017 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.web.core/4.4.8. 
Missing service: [org.apache.karaf.web.WebContainerService] 2025-10-14T00:47:16,017 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.commands/4.4.8 2025-10-14T00:47:16,030 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.commands/4.4.8 2025-10-14T00:47:16,031 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.shell.commands/4.4.8 2025-10-14T00:47:16,032 | INFO | paxweb-config-1-thread-1 (change controller) | AbstractConnector | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started jetty-default@1d84ea63{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} 2025-10-14T00:47:16,033 | INFO | paxweb-config-1-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started @21032ms 2025-10-14T00:47:16,034 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpService factory 2025-10-14T00:47:16,034 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.table/4.4.8 2025-10-14T00:47:16,036 | INFO | paxweb-config-1-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.http.core_4.4.8 [105]] 2025-10-14T00:47:16,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.blueprint/11.0.2 2025-10-14T00:47:16,049 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.http.core/4.4.8 2025-10-14T00:47:16,049 | INFO | HttpService->Whiteboard (add HttpService) | StoppableHttpServiceFactory | 396 - 
org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-whiteboard_8.0.33 [393]] 2025-10-14T00:47:16,050 | INFO | paxweb-config-1-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.web.core_4.4.8 [124]] 2025-10-14T00:47:16,051 | INFO | HttpService->WarExtender (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-war_8.0.33 [392]] 2025-10-14T00:47:16,054 | INFO | features-3-thread-1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Starting BlueprintBundleTracker 2025-10-14T00:47:16,059 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpServiceRuntime 2025-10-14T00:47:16,064 | INFO | paxweb-config-1-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-3,contextPath='/'} 2025-10-14T00:47:16,065 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)}", size=2} 2025-10-14T00:47:16,065 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.web.core/4.4.8 2025-10-14T00:47:16,065 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-3,contextPath='/'} 2025-10-14T00:47:16,071 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 
11.0.2 | Blueprint container for bundle org.apache.aries.blueprint.core_1.10.3 [79] was successfully created 2025-10-14T00:47:16,073 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.aries.blueprint.cm_1.3.2 [78] was successfully created 2025-10-14T00:47:16,073 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.karaf.shell.core_4.4.8 [120] was successfully created 2025-10-14T00:47:16,099 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)} to o.o.p.w.s.j.i.PaxWebServletContextHandler@3f451fed{/,null,STOPPED} 2025-10-14T00:47:16,101 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@3f451fed{/,null,STOPPED} 2025-10-14T00:47:16,356 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.google.guava.failureaccess/1.0.3 2025-10-14T00:47:16,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.annotation-api/1.3.5 2025-10-14T00:47:16,364 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.google.guava/33.4.8.jre 2025-10-14T00:47:16,365 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.ietf-type-util/14.0.18 2025-10-14T00:47:16,366 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.concepts/14.0.17 2025-10-14T00:47:16,366 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-common/14.0.17 2025-10-14T00:47:16,366 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-spec/14.0.17 2025-10-14T00:47:16,367 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-reflect/14.0.17 2025-10-14T00:47:16,367 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6991-ietf-inet-types/14.0.18 2025-10-14T00:47:16,367 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.yang-ext/2013.9.7.26_18 2025-10-14T00:47:16,368 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.inventory/0.20.1 2025-10-14T00:47:16,368 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.ietf-topology/2013.10.21.26_18 2025-10-14T00:47:16,369 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.topology/0.20.1 2025-10-14T00:47:16,369 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6991-ietf-yang-types/14.0.18 2025-10-14T00:47:16,369 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.opendaylight-l2-types/2013.8.27.26_18 2025-10-14T00:47:16,370 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.codegen-extensions/14.0.17 2025-10-14T00:47:16,370 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-api/0.20.1 2025-10-14T00:47:16,371 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.flow-base/0.20.1 2025-10-14T00:47:16,372 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | triemap/1.3.2 2025-10-14T00:47:16,372 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.util/14.0.17 2025-10-14T00:47:16,373 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-xpath-api/14.0.17 2025-10-14T00:47:16,373 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-api/14.0.17 2025-10-14T00:47:16,374 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-api/14.0.17 2025-10-14T00:47:16,374 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-spi/14.0.17 2025-10-14T00:47:16,375 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8528-model-api/14.0.17 2025-10-14T00:47:16,375 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8040-model-api/14.0.17 2025-10-14T00:47:16,376 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc7952-model-api/14.0.17 2025-10-14T00:47:16,376 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.yangtools.yang-ir/14.0.17 2025-10-14T00:47:16,377 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-spi/14.0.17 2025-10-14T00:47:16,378 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-util/14.0.17 2025-10-14T00:47:16,378 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-util/14.0.17 2025-10-14T00:47:16,379 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-impl/14.0.17 2025-10-14T00:47:16,379 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-transform/14.0.17 2025-10-14T00:47:16,379 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.antlr.antlr4-runtime/4.13.2 2025-10-14T00:47:16,380 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-repo-api/14.0.17 2025-10-14T00:47:16,380 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-repo-spi/14.0.17 2025-10-14T00:47:16,381 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-ri/14.0.17 2025-10-14T00:47:16,381 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-api/14.0.17 2025-10-14T00:47:16,381 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-spi/14.0.17 2025-10-14T00:47:16,382 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-reactor/14.0.17 2025-10-14T00:47:16,382 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-rfc7950/14.0.17 2025-10-14T00:47:16,383 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.lang3/3.18.0 2025-10-14T00:47:16,383 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.typesafe.config/1.4.3 2025-10-14T00:47:16,384 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.scala-lang.scala-library/2.13.16.v20250107-233423-VFINAL-3f6bdae 2025-10-14T00:47:16,385 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.typesafe.sslconfig/0.6.1 2025-10-14T00:47:16,385 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.agrona.core/1.15.2 2025-10-14T00:47:16,385 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.aeron.client/1.38.1 2025-10-14T00:47:16,386 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.aeron.driver/1.38.1 2025-10-14T00:47:16,386 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap_file__tmp_karaf-0.23.0_system_org_lmdbjava_lmdbjava_0.7.0_lmdbjava-0.7.0.jar/0.0.0 2025-10-14T00:47:16,387 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | reactive-streams/1.0.4 2025-10-14T00:47:16,387 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.repackaged-pekko/11.0.2 2025-10-14T00:47:16,392 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 
- org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.core/4.2.36 2025-10-14T00:47:16,393 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.jmx/4.2.36 2025-10-14T00:47:16,395 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.geronimo.specs.geronimo-atinject_1.0_spec/1.2.0 2025-10-14T00:47:16,395 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | lz4-java/1.8.0 2025-10-14T00:47:16,396 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.raft-api/11.0.2 2025-10-14T00:47:16,396 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.raft-spi/11.0.2 2025-10-14T00:47:16,397 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-tree-api/14.0.17 2025-10-14T00:47:16,397 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-tree-spi/14.0.17 2025-10-14T00:47:16,397 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-codec-binfmt/14.0.17 2025-10-14T00:47:16,399 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.flow-service/0.20.1 2025-10-14T00:47:16,400 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8341/14.0.18 2025-10-14T00:47:16,401 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9640/14.0.18 2025-10-14T00:47:16,402 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9641/14.0.18 2025-10-14T00:47:16,402 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.validation.jakarta.validation-api/2.0.2 2025-10-14T00:47:16,403 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.resolver/4.2.6.Final 2025-10-14T00:47:16,403 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.transport/4.2.6.Final 2025-10-14T00:47:16,404 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.transport-native-unix-common/4.2.6.Final 2025-10-14T00:47:16,404 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-base/4.2.6.Final 2025-10-14T00:47:16,405 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.handler/4.2.6.Final 2025-10-14T00:47:16,406 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.shaded-sshd/9.0.1 2025-10-14T00:47:16,407 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9643-ietf-tcp-common/14.0.18 2025-10-14T00:47:16,407 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9643-ietf-tcp-server/14.0.18 2025-10-14T00:47:16,408 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.servlet-api/3.1.0 2025-10-14T00:47:16,409 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.authn-api/0.21.2 2025-10-14T00:47:16,410 | 
INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-common-api/14.0.18 2025-10-14T00:47:16,410 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-binding-api/14.0.18 2025-10-14T00:47:16,411 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-eos-common-api/14.0.18 2025-10-14T00:47:16,412 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-eos-dom-api/14.0.18 2025-10-14T00:47:16,412 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.general-entity/14.0.18 2025-10-14T00:47:16,413 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-api/14.0.17 2025-10-14T00:47:16,413 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.google.gson/2.13.1 2025-10-14T00:47:16,413 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap_file__tmp_karaf-0.23.0_system_net_java_dev_stax-utils_stax-utils_20070216_stax-utils-20070216.jar/0.0.0 2025-10-14T00:47:16,414 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.text/1.14.0 2025-10-14T00:47:16,414 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | checker-qual/3.50.0 2025-10-14T00:47:16,415 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-access-api/11.0.2 2025-10-14T00:47:16,416 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.controller.cds-access-client/11.0.2 2025-10-14T00:47:16,416 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-mgmt-api/11.0.2 2025-10-14T00:47:16,417 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-common-util/11.0.2 2025-10-14T00:47:16,418 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-api/14.0.18 2025-10-14T00:47:16,431 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.odlext-model-api/14.0.17 2025-10-14T00:47:16,431 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-spi/14.0.18 2025-10-14T00:47:16,432 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | net.bytebuddy.byte-buddy/1.17.7 2025-10-14T00:47:16,434 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-model/14.0.17 2025-10-14T00:47:16,435 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-runtime-api/14.0.17 2025-10-14T00:47:16,435 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-spi/14.0.17 2025-10-14T00:47:16,435 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-loader/14.0.17 2025-10-14T00:47:16,436 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-dynamic/14.0.17 2025-10-14T00:47:16,441 | INFO | 
features-3-thread-1 | SimpleBindingDOMCodecFactory | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Binding/DOM Codec enabled 2025-10-14T00:47:16,441 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.odlext-parser-support/14.0.17 2025-10-14T00:47:16,442 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.openconfig-model-api/14.0.17 2025-10-14T00:47:16,443 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.openconfig-parser-support/14.0.17 2025-10-14T00:47:16,443 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6241-model-api/14.0.17 2025-10-14T00:47:16,444 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6241-parser-support/14.0.17 2025-10-14T00:47:16,446 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6536-model-api/14.0.17 2025-10-14T00:47:16,447 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6536-parser-support/14.0.17 2025-10-14T00:47:16,447 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6643-model-api/14.0.17 2025-10-14T00:47:16,448 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6643-parser-support/14.0.17 2025-10-14T00:47:16,448 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc7952-parser-support/14.0.17 2025-10-14T00:47:16,448 | INFO | features-3-thread-1 
| FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8040-parser-support/14.0.17 2025-10-14T00:47:16,449 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8528-parser-support/14.0.17 2025-10-14T00:47:16,449 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8639-model-api/14.0.17 2025-10-14T00:47:16,449 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8639-parser-support/14.0.17 2025-10-14T00:47:16,452 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8819-model-api/14.0.17 2025-10-14T00:47:16,452 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8819-parser-support/14.0.17 2025-10-14T00:47:16,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-xpath-impl/14.0.17 2025-10-14T00:47:16,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-impl/14.0.17 2025-10-14T00:47:16,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-runtime-spi/14.0.17 2025-10-14T00:47:16,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-generator/14.0.17 2025-10-14T00:47:16,468 | INFO | features-3-thread-1 | DefaultBindingRuntimeGenerator | 328 - org.opendaylight.yangtools.binding-generator - 14.0.17 | Binding/YANG type support activated 2025-10-14T00:47:16,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-runtime-osgi/14.0.17 2025-10-14T00:47:16,478 | INFO | features-3-thread-1 | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Binding Runtime activating 2025-10-14T00:47:16,479 | INFO | features-3-thread-1 | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Binding Runtime activated 2025-10-14T00:47:16,484 | INFO | features-3-thread-1 | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Model Runtime starting 2025-10-14T00:47:16,538 | INFO | features-3-thread-1 | KarafFeaturesSupport | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Will attempt to integrate with Karaf FeaturesService 2025-10-14T00:47:17,304 | INFO | features-3-thread-1 | NettyTransportSupport | 284 - org.opendaylight.netconf.transport-api - 9.0.1 | Netty transport backed by epoll(2) 2025-10-14T00:47:17,625 | INFO | features-3-thread-1 | SharedEffectiveModelContextFactory | 379 - org.opendaylight.yangtools.yang-parser-impl - 14.0.17 | Using weak references 2025-10-14T00:47:19,661 | INFO | features-3-thread-1 | OSGiModuleInfoSnapshotImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | EffectiveModelContext generation 1 activated 2025-10-14T00:47:20,346 | INFO | features-3-thread-1 | OSGiBindingRuntimeContextImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | BindingRuntimeContext generation 1 activated 2025-10-14T00:47:20,347 | INFO | features-3-thread-1 | GlobalBindingRuntimeContext | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Global BindingRuntimeContext generation 1 activated 2025-10-14T00:47:20,347 | INFO | features-3-thread-1 | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Model Runtime started 2025-10-14T00:47:20,348 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-osgi/14.0.17 2025-10-14T00:47:20,355 | INFO | features-3-thread-1 | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec activating 2025-10-14T00:47:20,376 | INFO | features-3-thread-1 | OSGiBindingDOMCodecServicesImpl | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec generation 1 activated 2025-10-14T00:47:20,376 | INFO | features-3-thread-1 | GlobalBindingDOMCodecServices | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Global Binding/DOM Codec activated with generation 1 2025-10-14T00:47:20,379 | INFO | features-3-thread-1 | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec activated 2025-10-14T00:47:20,379 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-codec-gson/14.0.17 2025-10-14T00:47:20,380 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | stax2-api/4.2.2 2025-10-14T00:47:20,381 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-codec-xml/14.0.17 2025-10-14T00:47:20,382 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-tree-ri/14.0.17 2025-10-14T00:47:20,384 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-schema-osgi/14.0.18 2025-10-14T00:47:20,392 | INFO | features-3-thread-1 | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.18 | DOM Schema services activated 2025-10-14T00:47:20,393 | INFO | features-3-thread-1 | OSGiDOMSchemaService | 251 - 
org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.18 | Updating context to generation 1 2025-10-14T00:47:20,394 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-distributed-datastore/11.0.2 2025-10-14T00:47:20,408 | INFO | features-3-thread-1 | FileModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Shard configuration provider started 2025-10-14T00:47:20,408 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding-dom-adapter/14.0.18 2025-10-14T00:47:20,426 | INFO | features-3-thread-1 | OSGiBlockingBindingNormalizer | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter activated 2025-10-14T00:47:20,435 | INFO | features-3-thread-1 | DynamicBindingAdapter | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | 8 DOMService trackers started 2025-10-14T00:47:20,436 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.eos-dom-akka/11.0.2 2025-10-14T00:47:20,438 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.googlecode.json-simple/1.1.1 2025-10-14T00:47:20,439 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6241/14.0.18 2025-10-14T00:47:20,441 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.util/7.1.7 2025-10-14T00:47:20,441 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-eos-binding-api/14.0.18 2025-10-14T00:47:20,441 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.mdsal.eos-binding-adapter/14.0.18 2025-10-14T00:47:20,443 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.device-ownership-service/0.20.1 2025-10-14T00:47:20,445 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-singleton-api/14.0.18 2025-10-14T00:47:20,445 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.api/0.20.1 2025-10-14T00:47:20,446 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.libraries.liblldp/0.20.1 2025-10-14T00:47:20,446 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 2025-10-14T00:47:20,471 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.NotificationService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService)] 2025-10-14T00:47:20,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 2025-10-14T00:47:20,478 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 
1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-14T00:47:20,480 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.flow-statistics/0.20.1 2025-10-14T00:47:20,480 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-spi/0.20.1 2025-10-14T00:47:20,514 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.extension-api/0.20.1 2025-10-14T00:47:20,518 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin/0.20.1 2025-10-14T00:47:20,522 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-common-netty/14.0.17 2025-10-14T00:47:20,526 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.transport-classes-epoll/4.2.6.Final 2025-10-14T00:47:20,527 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.ready-api/7.1.7 2025-10-14T00:47:20,527 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.diagstatus-api/7.1.7 2025-10-14T00:47:20,528 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.util/0.20.1 2025-10-14T00:47:20,529 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.odlparent.bundles-diag/14.1.3 2025-10-14T00:47:20,534 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.ready-impl/7.1.7 2025-10-14T00:47:20,551 | INFO | features-3-thread-1 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | ThreadFactory for SystemReadyService created 2025-10-14T00:47:20,553 | INFO | features-3-thread-1 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | Now starting to provide full system readiness status updates (see TestBundleDiag's logs)... 2025-10-14T00:47:20,555 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.diagstatus-impl/7.1.7 2025-10-14T00:47:20,556 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | checkBundleDiagInfos() started... 2025-10-14T00:47:20,569 | INFO | features-3-thread-1 | DiagStatusServiceImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.7 | Diagnostic Status Service started 2025-10-14T00:47:20,580 | INFO | features-3-thread-1 | MBeanUtils | 198 - org.opendaylight.infrautils.diagstatus-api - 7.1.7 | MBean registration for org.opendaylight.infrautils.diagstatus:type=SvcStatus SUCCESSFUL. 
2025-10-14T00:47:20,581 | INFO | features-3-thread-1 | DiagStatusServiceMBeanImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.7 | Diagnostic Status Service management started 2025-10-14T00:47:20,581 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl/0.20.1 2025-10-14T00:47:20,593 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-binding-spi/14.0.18 2025-10-14T00:47:20,597 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.common/0.20.1 2025-10-14T00:47:20,598 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-singleton-impl/14.0.18 2025-10-14T00:47:20,599 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.impl/0.20.1 2025-10-14T00:47:20,666 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationServiceFactory), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-10-14T00:47:20,675 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.impl/0.20.1. 
Missing service: [org.opendaylight.openflowplugin.api.openflow.statistics.ofpspecific.MessageIntelligenceAgency] 2025-10-14T00:47:20,684 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-10-14T00:47:20,689 | INFO | features-3-thread-1 | MessageIntelligenceAgencyImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Registered MBean org.opendaylight.openflowplugin.impl.statistics.ofpspecific:type=MessageIntelligenceAgencyMXBean 2025-10-14T00:47:20,689 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.impl/0.20.1 2025-10-14T00:47:20,690 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.extension-onf/0.20.1 2025-10-14T00:47:20,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9642/14.0.18 2025-10-14T00:47:20,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.tls-cipher-suite-algs/14.0.18 2025-10-14T00:47:20,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9645-ietf-tls-common/14.0.18 2025-10-14T00:47:20,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9645-ietf-tls-client/14.0.18 2025-10-14T00:47:20,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.mdsal.binding.model.ietf.rfc8343/14.0.18 2025-10-14T00:47:20,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8344/14.0.18 2025-10-14T00:47:20,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.collections/3.2.2 2025-10-14T00:47:20,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.commons-beanutils/1.11.0 2025-10-14T00:47:20,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.owasp.encoder/1.3.1 2025-10-14T00:47:20,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.repackaged-shiro/0.21.2 2025-10-14T00:47:20,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.ssh-public-key-algs/14.0.18 2025-10-14T00:47:20,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-cluster-admin-api/11.0.2 2025-10-14T00:47:20,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.util/1.1.3 2025-10-14T00:47:20,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8528/14.0.18 2025-10-14T00:47:20,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8529/14.0.18 2025-10-14T00:47:20,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8040-ietf-restconf/14.0.18 2025-10-14T00:47:20,699 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8639/14.0.18 2025-10-14T00:47:20,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6243/14.0.18 2025-10-14T00:47:20,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-api/9.0.1 2025-10-14T00:47:20,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-compression/4.2.6.Final 2025-10-14T00:47:20,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-http/4.2.6.Final 2025-10-14T00:47:20,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-http2/4.2.6.Final 2025-10-14T00:47:20,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.commons-codec/1.19.0 2025-10-14T00:47:20,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-api/9.0.1 2025-10-14T00:47:20,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9643-ietf-tcp-client/14.0.18 2025-10-14T00:47:20,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-tcp/9.0.1 2025-10-14T00:47:20,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9645-ietf-tls-server/14.0.18 2025-10-14T00:47:20,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.netconf.transport-tls/9.0.1 2025-10-14T00:47:20,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.crypt-hash/14.0.18 2025-10-14T00:47:20,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-http/9.0.1 2025-10-14T00:47:20,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc7407-ietf-x509-cert-to-name/14.0.18 2025-10-14T00:47:20,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.model.draft-ietf-restconf-server/9.0.1 2025-10-14T00:47:20,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.srm-api/0.20.1 2025-10-14T00:47:20,754 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.core.jersey-client/2.47.0 2025-10-14T00:47:20,754 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.truststore-none/9.0.1 2025-10-14T00:47:20,755 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.h2database/2.3.232 2025-10-14T00:47:20,778 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.containers.jersey-container-servlet-core/2.47.0 2025-10-14T00:47:20,785 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.servlet-api/0.21.2 2025-10-14T00:47:20,789 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.servlet-jersey2/0.21.2 2025-10-14T00:47:20,797 
| INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8072/14.0.18 2025-10-14T00:47:20,797 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.netconf-api/9.0.1 2025-10-14T00:47:20,798 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.dom-api/9.0.1 2025-10-14T00:47:20,798 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.shiro-api/0.21.2 2025-10-14T00:47:20,799 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.ssh-encryption-algs/14.0.18 2025-10-14T00:47:20,799 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.ssh-key-exchange-algs/14.0.18 2025-10-14T00:47:20,800 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.ssh-mac-algs/14.0.18 2025-10-14T00:47:20,801 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9644-ietf-ssh-common/14.0.18 2025-10-14T00:47:20,801 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.model.rfc5277/9.0.1 2025-10-14T00:47:20,801 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-repo-fs/14.0.17 2025-10-14T00:47:20,802 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.jolokia.osgi/1.7.2 2025-10-14T00:47:20,804 | INFO | features-3-thread-1 | StoppableHttpServiceFactory | 396 - 
org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.jolokia.osgi_1.7.2 [155]] 2025-10-14T00:47:20,823 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@1fd28a48,contexts=[{HS,OCM-5,context:407736916,/}]} 2025-10-14T00:47:20,824 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@1fd28a48,contexts=null}", size=3} 2025-10-14T00:47:20,824 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{HS,id=OCM-5,name='context:407736916',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:407736916',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@184d9254}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@3f451fed{/,null,STOPPED} 2025-10-14T00:47:20,825 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@3f451fed{/,null,STOPPED} 2025-10-14T00:47:20,826 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@1fd28a48,contexts=[{HS,OCM-5,context:407736916,/}]} 2025-10-14T00:47:20,829 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 
394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/" with default Osgi Context OsgiContextModel{HS,id=OCM-5,name='context:407736916',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:407736916',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@184d9254}} 2025-10-14T00:47:20,847 | INFO | paxweb-config-1-thread-1 | osgi | 155 - org.jolokia.osgi - 1.7.2 | No access restrictor found, access to any MBean is allowed 2025-10-14T00:47:20,872 | INFO | paxweb-config-1-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@3f451fed{/,null,AVAILABLE} 2025-10-14T00:47:20,873 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{HS,id=OCM-5,name='context:407736916',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:407736916',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@184d9254}}} as OSGi service for "/" context path 2025-10-14T00:47:20,876 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.password-service-api/0.21.2 2025-10-14T00:47:20,878 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.encrypt-service/0.21.2 2025-10-14T00:47:20,879 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.cert/0.21.2 2025-10-14T00:47:20,884 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), 
(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)] 2025-10-14T00:47:20,884 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6470/14.0.18 2025-10-14T00:47:20,885 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8342-ietf-datastores/14.0.18 2025-10-14T00:47:20,885 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc7952/14.0.18 2025-10-14T00:47:20,886 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8342-ietf-origin/14.0.18 2025-10-14T00:47:20,886 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8526/14.0.18 2025-10-14T00:47:20,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.encrypt-service-impl/0.21.2 2025-10-14T00:47:20,891 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.databind/9.0.1 2025-10-14T00:47:20,892 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-api/9.0.1 2025-10-14T00:47:20,892 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.netconf-common-mdsal/9.0.1 2025-10-14T00:47:20,893 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.yangtools.yang-model-export/14.0.17
2025-10-14T00:47:20,893 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-spi/9.0.1
2025-10-14T00:47:20,894 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-mdsal-spi/9.0.1
2025-10-14T00:47:20,894 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8040-ietf-restconf-monitoring/14.0.18
2025-10-14T00:47:20,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8650/14.0.18
2025-10-14T00:47:20,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-broker/14.0.18
2025-10-14T00:47:20,905 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for MountPointService activated
2025-10-14T00:47:20,912 | INFO | features-3-thread-1 | DOMRpcRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.18 | DOM RPC/Action router started
2025-10-14T00:47:20,915 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for ActionProviderService activated
2025-10-14T00:47:20,918 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for ActionService activated
2025-10-14T00:47:20,923 | INFO | features-3-thread-1 | DOMNotificationRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.18 | DOM Notification Router started
2025-10-14T00:47:20,925 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for NotificationService activated
2025-10-14T00:47:20,925 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService)]
2025-10-14T00:47:20,928 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T00:47:20,928 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for NotificationPublishService activated
2025-10-14T00:47:20,931 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T00:47:20,931 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)]
2025-10-14T00:47:20,931 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for RpcProviderService activated
2025-10-14T00:47:20,933 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for RpcService activated
2025-10-14T00:47:20,933 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T00:47:20,934 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.api/0.21.2
2025-10-14T00:47:20,935 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8525/14.0.18
2025-10-14T00:47:20,935 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.yanglib-mdsal-writer/9.0.1
2025-10-14T00:47:20,936 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.osgi-impl/0.21.2
2025-10-14T00:47:20,938 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-jaxrs/9.0.1
2025-10-14T00:47:20,941 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-subscription/9.0.1
2025-10-14T00:47:20,944 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.odl-device-notification/9.0.1
2025-10-14T00:47:20,945 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.model.sal-remote/9.0.1
2025-10-14T00:47:20,946 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.sal-remote-impl/9.0.1
2025-10-14T00:47:20,947 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-mdsal/9.0.1
2025-10-14T00:47:20,951 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server/9.0.1
2025-10-14T00:47:20,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-nb/9.0.1
2025-10-14T00:47:20,956 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9644-ietf-ssh-server/14.0.18
2025-10-14T00:47:20,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javassist/3.30.2.GA
2025-10-14T00:47:20,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.external.aopalliance-repackaged/2.6.1
2025-10-14T00:47:20,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.osgi-resource-locator/1.0.3
2025-10-14T00:47:21,000 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.media.jersey-media-sse/2.47.0
2025-10-14T00:47:21,001 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.rabbitmq.client/5.26.0
2025-10-14T00:47:21,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.jvm/4.2.36
2025-10-14T00:47:21,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.inject.jersey-hk2/2.47.0
2025-10-14T00:47:21,003 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1
2025-10-14T00:47:21,004 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1. Missing service: [org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager]
2025-10-14T00:47:21,008 | INFO | features-3-thread-1 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | ReconciliationManager started
2025-10-14T00:47:21,008 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1
2025-10-14T00:47:21,009 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.arbitratorreconciliation-api/0.20.1
2025-10-14T00:47:21,009 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.srm-impl/0.20.1
2025-10-14T00:47:21,013 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1
2025-10-14T00:47:21,017 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T00:47:21,025 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T00:47:21,025 | INFO | features-3-thread-1 | OpenflowServiceRecoveryHandlerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | Registering openflowplugin service recovery handlers
2025-10-14T00:47:21,026 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9644-ietf-ssh-client/14.0.18
2025-10-14T00:47:21,027 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.of-switch-config-pusher/0.20.1
2025-10-14T00:47:21,029 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.healthchecks/4.2.36
2025-10-14T00:47:21,030 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.atomix-storage/11.0.2
2025-10-14T00:47:21,030 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-akka-segmented-journal/11.0.2
2025-10-14T00:47:21,031 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.bulk-o-matic/0.20.1
2025-10-14T00:47:21,033 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.containers.jersey-container-servlet/2.47.0
2025-10-14T00:47:21,036 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.topology-manager/0.20.1
2025-10-14T00:47:21,039 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.diagstatus-shell/7.1.7
2025-10-14T00:47:21,044 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.7
2025-10-14T00:47:21,044 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl/0.20.1
2025-10-14T00:47:21,047 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.keystore-none/9.0.1
2025-10-14T00:47:21,048 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jdbc.core/4.4.8
2025-10-14T00:47:21,060 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jdbc.core/4.4.8
2025-10-14T00:47:21,061 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.password-service-impl/0.21.2
2025-10-14T00:47:21,067 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.blueprint-config/0.20.1
2025-10-14T00:47:21,074 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-remoterpc-connector/11.0.2
2025-10-14T00:47:21,075 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.jspecify.jspecify/1.0.0
2025-10-14T00:47:21,076 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | karaf.branding/14.1.3
2025-10-14T00:47:21,076 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.idm-store-h2/0.21.2
2025-10-14T00:47:21,078 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.jetty-auth-log-filter/0.21.2
2025-10-14T00:47:21,079 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-ssh/9.0.1
2025-10-14T00:47:21,079 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.srm-shell/0.20.1
2025-10-14T00:47:21,082 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.srm-shell/0.20.1. Missing service: [org.opendaylight.serviceutils.srm.spi.RegistryControl, org.opendaylight.mdsal.binding.api.DataBroker]
2025-10-14T00:47:21,082 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.tokenauthrealm/0.21.2
2025-10-14T00:47:21,084 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding-util/14.0.18
2025-10-14T00:47:21,084 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.graphite/4.2.36
2025-10-14T00:47:21,085 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-dom-api/11.0.2
2025-10-14T00:47:21,085 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.utils/2.6.1
2025-10-14T00:47:21,087 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.api/2.6.1
2025-10-14T00:47:21,088 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.locator/2.6.1
2025-10-14T00:47:21,089 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.filterchain/0.21.2
2025-10-14T00:47:21,093 | INFO | features-3-thread-1 | CustomFilterAdapterConfigurationImpl | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Custom filter properties updated: {service.pid=org.opendaylight.aaa.filterchain, osgi.ds.satisfying.condition.target=(osgi.condition.id=true), customFilterList=, component.name=org.opendaylight.aaa.filterchain.configuration.impl.CustomFilterAdapterConfigurationImpl, felix.fileinstall.filename=file:/tmp/karaf-0.23.0/etc/org.opendaylight.aaa.filterchain.cfg, component.id=120, Filter.target=(org.opendaylight.aaa.filterchain.filter=true)}
2025-10-14T00:47:21,094 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.shiro/0.21.2
2025-10-14T00:47:21,099 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.api.IIDMStore), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T00:47:21,111 | INFO | features-3-thread-1 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.aaa.shiro_0.21.2 [172]]
2025-10-14T00:47:21,112 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}
2025-10-14T00:47:21,112 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}", size=1}
2025-10-14T00:47:21,112 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}
2025-10-14T00:47:21,113 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.activation-api/1.2.2
2025-10-14T00:47:21,117 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.core.jersey-common/2.47.0
2025-10-14T00:47:21,118 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.ws.rs-api/2.1.6
2025-10-14T00:47:21,119 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.core.jersey-server/2.47.0
2025-10-14T00:47:21,241 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-clustering-commons/11.0.2
2025-10-14T00:47:21,250 | INFO | features-3-thread-1 | FileAkkaConfigurationReader | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | File-based Pekko configuration reader enabled
2025-10-14T00:47:21,251 | INFO | features-3-thread-1 | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Actor System provider starting
2025-10-14T00:47:21,430 | INFO | features-3-thread-1 | ActorSystemProviderImpl | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating new ActorSystem
2025-10-14T00:47:21,812 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Slf4jLogger | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Slf4jLogger started
2025-10-14T00:47:22,058 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ArteryTransport | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Remoting started with transport [Artery tcp]; listening on address [pekko://opendaylight-cluster-data@10.30.171.168:2550] with UID [2085355596663801694]
2025-10-14T00:47:22,069 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Starting up, Pekko version [1.0.3] ...
2025-10-14T00:47:22,111 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Registered cluster JMX MBean [pekko:type=Cluster]
2025-10-14T00:47:22,111 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Started up successfully
2025-10-14T00:47:22,156 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR started. Config: strategy [KeepMajority], stable-after [7 seconds], down-all-when-unstable [5250 milliseconds], selfUniqueAddress [pekko://opendaylight-cluster-data@10.30.171.168:2550#2085355596663801694], selfDc [default].
2025-10-14T00:47:22,370 | INFO | features-3-thread-1 | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Actor System provider started
2025-10-14T00:47:22,376 | INFO | features-3-thread-1 | OSGiDatastoreContextIntrospectorFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore Context Introspector activated
2025-10-14T00:47:22,378 | INFO | features-3-thread-1 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type CONFIGURATION starting
2025-10-14T00:47:22,647 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Create data store instance of type : config
2025-10-14T00:47:22,647 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it
2025-10-14T00:47:22,648 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it
2025-10-14T00:47:22,654 | INFO | features-3-thread-1 | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating ShardManager : shardmanager-config
2025-10-14T00:47:22,714 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Starting ShardManager shard-manager-config
2025-10-14T00:47:22,726 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-32 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.2 | Initialized with root directory segmented-journal with storage MAPPED
2025-10-14T00:47:22,791 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Recovery complete
2025-10-14T00:47:22,806 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | RecoveringClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: saving tombstone ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}
2025-10-14T00:47:22,841 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoinNack message from [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon#1843328775]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550]
2025-10-14T00:47:22,854 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Data store config is using tell-based protocol
2025-10-14T00:47:22,858 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it
2025-10-14T00:47:22,858 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it
2025-10-14T00:47:22,859 | INFO | features-3-thread-1 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type OPERATIONAL starting
2025-10-14T00:47:22,860 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Create data store instance of type : operational
2025-10-14T00:47:22,860 | INFO | features-3-thread-1 | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating ShardManager : shardmanager-operational
2025-10-14T00:47:22,863 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Starting ShardManager shard-manager-operational
2025-10-14T00:47:22,875 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-topology-config: Shard created, persistent : true
2025-10-14T00:47:22,875 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Shard created, persistent : true
2025-10-14T00:47:22,876 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-toaster-config: Shard created, persistent : true
2025-10-14T00:47:22,877 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RecoveringClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: saving tombstone ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}
2025-10-14T00:47:22,879 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Data store operational is using tell-based protocol
2025-10-14T00:47:22,880 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-config: Shard created, persistent : true
2025-10-14T00:47:22,882 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Recovery complete
2025-10-14T00:47:22,883 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T00:47:22,883 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T00:47:22,891 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: Shard created, persistent : false
2025-10-14T00:47:22,894 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-operational: Shard created, persistent : false
2025-10-14T00:47:22,892 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-topology-operational: Shard created, persistent : false
2025-10-14T00:47:22,896 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-toaster-operational: Shard created, persistent : false
2025-10-14T00:47:22,908 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-default-config/member-1-shard-default-config-notifier#-731455857 created and ready for shard:member-1-shard-default-config
2025-10-14T00:47:22,910 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-topology-operational/member-1-shard-topology-operational-notifier#-1743276614 created and ready for shard:member-1-shard-topology-operational
2025-10-14T00:47:22,910 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-topology-config/member-1-shard-topology-config-notifier#-2104235062 created and ready for shard:member-1-shard-topology-config
2025-10-14T00:47:22,910 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-inventory-operational/member-1-shard-inventory-operational-notifier#-81268317 created and ready for shard:member-1-shard-inventory-operational
2025-10-14T00:47:22,911 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-inventory-config/member-1-shard-inventory-config-notifier#-240584850 created and ready for shard:member-1-shard-inventory-config
2025-10-14T00:47:22,911 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Starting recovery with journal batch size 1
2025-10-14T00:47:22,912 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Starting recovery with journal batch size 1
2025-10-14T00:47:22,913 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Starting recovery with journal batch size 1
2025-10-14T00:47:22,913 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-default-operational/member-1-shard-default-operational-notifier#1628907425 created and ready for shard:member-1-shard-default-operational
2025-10-14T00:47:22,913 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-toaster-operational/member-1-shard-toaster-operational-notifier#-2122123024 created and ready for shard:member-1-shard-toaster-operational
2025-10-14T00:47:22,913 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-toaster-config/member-1-shard-toaster-config-notifier#895658466 created and ready for shard:member-1-shard-toaster-config
2025-10-14T00:47:22,914 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Starting recovery with journal batch size 1
2025-10-14T00:47:22,914 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Starting recovery with journal batch size 1
2025-10-14T00:47:22,917 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Starting recovery with journal batch size 1
2025-10-14T00:47:22,918 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Starting recovery with journal batch size 1
2025-10-14T00:47:22,921 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Starting recovery with journal batch size 1
2025-10-14T00:47:22,941 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-47 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.2 | Initialized with root directory segmented-journal with storage DISK
2025-10-14T00:47:22,971 | INFO | features-3-thread-1 | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.2 | Remote Operations service starting
2025-10-14T00:47:22,973 | INFO | features-3-thread-1 | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.2 | Remote Operations service started
2025-10-14T00:47:22,974 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-akka-raft/11.0.2
2025-10-14T00:47:22,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.common/4.2.6.Final
2025-10-14T00:47:22,979 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.buffer/4.2.6.Final
2025-10-14T00:47:22,979 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.raft-journal/11.0.2
2025-10-14T00:47:22,981 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.blueprint-config/0.20.1
2025-10-14T00:47:22,982 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.truststore-api/9.0.1
2025-10-14T00:47:22,997 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.keystore-api/9.0.1
2025-10-14T00:47:23,005 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-cluster-admin-impl/11.0.2
2025-10-14T00:47:23,012 | INFO | features-3-thread-1 | Activator | 99 - org.apache.karaf.deployer.features - 4.4.8 | Deployment finished. Registering FeatureDeploymentListener
2025-10-14T00:47:23,077 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: journal open: applyTo=0
2025-10-14T00:47:23,077 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: journal open: applyTo=0
2025-10-14T00:47:23,077 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: journal open: applyTo=0
2025-10-14T00:47:23,077 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: journal open: applyTo=0
2025-10-14T00:47:23,077 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: journal open: applyTo=0
2025-10-14T00:47:23,079 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: journal open: applyTo=0
2025-10-14T00:47:23,079 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: journal open: applyTo=0
2025-10-14T00:47:23,079 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: journal open: applyTo=0
2025-10-14T00:47:23,096 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Recovery completed -
Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T00:47:23,097 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T00:47:23,097 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T00:47:23,097 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T00:47:23,096 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T00:47:23,098 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T00:47:23,102 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | 
member-1-shard-default-config: Local TermInfo store seeded with TermInfo{term=0} 2025-10-14T00:47:23,105 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Local TermInfo store seeded with TermInfo{term=0} 2025-10-14T00:47:23,105 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-10-14T00:47:23,105 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-10-14T00:47:23,106 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-10-14T00:47:23,106 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Local TermInfo store seeded with TermInfo{term=0} 2025-10-14T00:47:23,110 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T00:47:23,112 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from null to Follower 2025-10-14T00:47:23,113 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | 
PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T00:47:23,113 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Local TermInfo store seeded with TermInfo{term=0} 2025-10-14T00:47:23,115 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from null to Follower 2025-10-14T00:47:23,115 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-config , received role change from null to Follower 2025-10-14T00:47:23,116 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-14T00:47:23,116 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-14T00:47:23,116 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-config , registered listener 
pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-14T00:47:23,117 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from null to Follower 2025-10-14T00:47:23,117 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-config , received role change from null to Follower 2025-10-14T00:47:23,117 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-14T00:47:23,117 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-topology-config from null to Follower 2025-10-14T00:47:23,117 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-toaster-config from null to Follower 2025-10-14T00:47:23,117 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from null to Follower 2025-10-14T00:47:23,117 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for 
member-1-shard-topology-operational from null to Follower 2025-10-14T00:47:23,118 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-14T00:47:23,118 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-default-config from null to Follower 2025-10-14T00:47:23,118 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-14T00:47:23,118 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from null to Follower 2025-10-14T00:47:23,119 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from null to Follower 2025-10-14T00:47:23,119 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from null to Follower 2025-10-14T00:47:23,120 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | 
RoleChangeNotifier for member-1-shard-inventory-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-14T00:47:23,120 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-inventory-config from null to Follower 2025-10-14T00:47:23,120 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-10-14T00:47:23,121 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-operational , received role change from null to Follower 2025-10-14T00:47:23,121 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-14T00:47:23,121 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-default-operational from null to Follower 2025-10-14T00:47:23,218 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-719231303]], but this node is not initialized yet 2025-10-14T00:47:23,226 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoinNack message from [Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/system/cluster/core/daemon#1252037169]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550] 2025-10-14T00:47:23,238 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] is JOINING itself (with roles [member-1, dc-default], version [0.0.0]) and forming new cluster 2025-10-14T00:47:23,239 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - is the new leader among reachable nodes (more leaders may exist) 2025-10-14T00:47:23,246 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.171.168:2550] to [Up] 2025-10-14T00:47:23,250 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.171.168:2550 2025-10-14T00:47:23,250 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-config with address 
pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-default-config 2025-10-14T00:47:23,250 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.171.168:2550 2025-10-14T00:47:23,250 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-10-14T00:47:23,250 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-10-14T00:47:23,250 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-10-14T00:47:23,250 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-10-14T00:47:23,250 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-topology-config 2025-10-14T00:47:23,251 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-10-14T00:47:23,251 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-10-14T00:47:23,259 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist). 2025-10-14T00:47:23,264 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Singleton manager starting singleton actor [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2025-10-14T00:47:23,264 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | ClusterSingletonManager state change [Start -> Oldest] 2025-10-14T00:47:23,392 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Done. 
2025-10-14T00:47:23,639 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1603891383]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550] 2025-10-14T00:47:23,639 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.171.168:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1603891383]] (version [1.0.3]) 2025-10-14T00:47:23,711 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.161:2550] is JOINING, roles [member-3, dc-default], version [0.0.0] 2025-10-14T00:47:23,719 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#1235762114] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T00:47:23,720 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#1790704553] was unhandled. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T00:47:24,177 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.171.161:2550] to [Up] 2025-10-14T00:47:24,179 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is not the leader any more and not responsible for taking SBR decisions. 
2025-10-14T00:47:24,180 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T00:47:24,180 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-14T00:47:24,180 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-14T00:47:24,180 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-14T00:47:24,181 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-14T00:47:24,180 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Peer address for peer member-3-shard-default-config set to 
pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-14T00:47:24,181 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-14T00:47:24,181 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-14T00:47:24,181 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-14T00:47:24,181 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T00:47:24,181 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-14T00:47:24,181 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-14T00:47:24,182 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-14T00:47:24,182 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-14T00:47:24,182 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-14T00:47:24,182 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-14T00:47:24,182 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to 
pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-14T00:47:24,182 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-14T00:47:24,264 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClusterSingletonProxy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Singleton identified at [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2025-10-14T00:47:25,194 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - is no longer leader 2025-10-14T00:47:30,971 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Follower): Term 2 in "RequestVote{term=2, candidateId=member-3-shard-topology-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 0 - updating term 2025-10-14T00:47:31,000 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Follower): Term 2 in "RequestVote{term=2, candidateId=member-3-shard-inventory-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 0 - updating term 2025-10-14T00:47:31,007 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received 
LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@378be39a 2025-10-14T00:47:31,009 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done false 2025-10-14T00:47:31,018 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4f0ad7e6 2025-10-14T00:47:31,019 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-inventory-operational status sync done false 2025-10-14T00:47:31,038 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower): Term 2 in "RequestVote{term=2, candidateId=member-3-shard-inventory-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 0 - updating term 2025-10-14T00:47:31,048 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@55b68757 2025-10-14T00:47:31,049 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done false 
2025-10-14T00:47:31,049 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational (Follower): Term 2 in "RequestVote{term=2, candidateId=member-3-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 0 - updating term
2025-10-14T00:47:31,059 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2f633439
2025-10-14T00:47:31,059 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-toaster-operational status sync done false
2025-10-14T00:47:31,079 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Follower): Term 2 in "RequestVote{term=2, candidateId=member-3-shard-default-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 0 - updating term
2025-10-14T00:47:31,087 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@3c436a1
2025-10-14T00:47:31,088 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready
2025-10-14T00:47:31,089 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done false
2025-10-14T00:47:31,093 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore service type OPERATIONAL activated
2025-10-14T00:47:31,093 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type OPERATIONAL started
2025-10-14T00:47:31,123 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Follower): Term 2 in "RequestVote{term=2, candidateId=member-3-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 0 - updating term
2025-10-14T00:47:31,123 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Follower): Term 2 in "RequestVote{term=2, candidateId=member-3-shard-default-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 0 - updating term
2025-10-14T00:47:31,123 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Follower): Term 2 in "RequestVote{term=2, candidateId=member-3-shard-toaster-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 0 - updating term
2025-10-14T00:47:31,134 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6dda9314
2025-10-14T00:47:31,135 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@15062600
2025-10-14T00:47:31,135 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-toaster-config status sync done false
2025-10-14T00:47:31,135 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done false
2025-10-14T00:47:31,137 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@67cc4811
2025-10-14T00:47:31,138 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready
2025-10-14T00:47:31,138 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done false
2025-10-14T00:47:31,145 | ERROR | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ConcurrentDOMDataBroker | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | bundle org.opendaylight.controller.sal-distributed-datastore:11.0.2 (196)[org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker(22)] : Constructor argument 1 in class class org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker has unsupported type java.util.concurrent.Executor
2025-10-14T00:47:31,155 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ConcurrentDOMDataBroker | 358 - org.opendaylight.yangtools.util - 14.0.17 | ThreadFactory created: CommitFutures
2025-10-14T00:47:31,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | DataBrokerCommitExecutor | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | DOM Data Broker commit exector started
2025-10-14T00:47:31,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore service type CONFIGURATION activated
2025-10-14T00:47:31,160 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ConcurrentDOMDataBroker | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | DOM Data Broker started
2025-10-14T00:47:31,164 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for DataBroker activated
2025-10-14T00:47:31,200 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), Initial app config TopologyLldpDiscoveryConfig]
2025-10-14T00:47:31,216 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [Initial app config LldpSpeakerConfig]
2025-10-14T00:47:31,225 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config#135908184], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2025-10-14T00:47:31,226 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config#135908184], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2025-10-14T00:47:31,248 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config#135908184], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 22.66 ms
2025-10-14T00:47:31,263 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)]
2025-10-14T00:47:31,302 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | EOSClusterSingletonServiceProvider | 257 - org.opendaylight.mdsal.mdsal-singleton-impl - 14.0.18 | Cluster Singleton Service started
2025-10-14T00:47:31,315 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | ietf-yang-library writer registered
2025-10-14T00:47:31,405 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), Initial app config ForwardingRulesManagerConfig]
2025-10-14T00:47:31,411 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | DeviceOwnershipService started
2025-10-14T00:47:31,429 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2025-10-14T00:47:31,471 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)]
2025-10-14T00:47:31,471 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2025-10-14T00:47:31,493 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Loading properties from '(urn:opendaylight:params:xml:ns:yang:openflow:provider:config?revision=2016-05-10)openflow-provider-config' YANG file
2025-10-14T00:47:31,499 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | rpc-requests-quota configuration property was changed to '20000'
2025-10-14T00:47:31,500 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | global-notification-quota configuration property was changed to '64000'
2025-10-14T00:47:31,500 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | switch-features-mandatory configuration property was changed to 'false'
2025-10-14T00:47:31,500 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | enable-flow-removed-notification configuration property was changed to 'true'
2025-10-14T00:47:31,500 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-statistics-rpc-enabled configuration property was changed to 'false'
2025-10-14T00:47:31,500 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | barrier-count-limit configuration property was changed to '25600'
2025-10-14T00:47:31,500 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | barrier-interval-timeout-limit configuration property was changed to '500'
2025-10-14T00:47:31,500 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | echo-reply-timeout configuration property was changed to '2000'
2025-10-14T00:47:31,500 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-statistics-polling-on configuration property was changed to 'true'
2025-10-14T00:47:31,501 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-table-statistics-polling-on configuration property was changed to 'true'
2025-10-14T00:47:31,501 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-flow-statistics-polling-on configuration property was changed to 'true'
2025-10-14T00:47:31,501 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-group-statistics-polling-on configuration property was changed to 'true'
2025-10-14T00:47:31,501 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-meter-statistics-polling-on configuration property was changed to 'true'
2025-10-14T00:47:31,501 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-port-statistics-polling-on configuration property was changed to 'true'
2025-10-14T00:47:31,501 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-queue-statistics-polling-on configuration property was changed to 'true'
2025-10-14T00:47:31,502 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | skip-table-features configuration property was changed to 'true'
2025-10-14T00:47:31,502 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | basic-timer-delay configuration property was changed to '3000'
2025-10-14T00:47:31,502 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | maximum-timer-delay configuration property was changed to '900000'
2025-10-14T00:47:31,502 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | use-single-layer-serialization configuration property was changed to 'true'
2025-10-14T00:47:31,502 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-min-threads configuration property was changed to '1'
2025-10-14T00:47:31,502 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-max-threads configuration property was changed to '32000'
2025-10-14T00:47:31,502 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-timeout configuration property was changed to '60'
2025-10-14T00:47:31,502 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-connection-rate-limit-per-min configuration property was changed to '0'
2025-10-14T00:47:31,502 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-connection-hold-time-in-seconds configuration property was changed to '0'
2025-10-14T00:47:31,503 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-datastore-removal-delay configuration property was changed to '500'
2025-10-14T00:47:31,503 | INFO | Blueprint Extender: 1 | OSGiConfigurationServiceFactory | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Loading configuration from 'org.opendaylight.openflowplugin' configuration file
2025-10-14T00:47:31,516 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | felix.fileinstall.filename configuration property was changed to 'file:/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg'
2025-10-14T00:47:31,516 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | service.pid configuration property was changed to 'org.opendaylight.openflowplugin'
2025-10-14T00:47:31,515 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | DefaultConfigPusher | 301 - org.opendaylight.openflowplugin.applications.of-switch-config-pusher - 0.20.1 | DefaultConfigPusher has started.
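The openflowplugin property values logged above are applied from Karaf's `etc/org.opendaylight.openflowplugin.cfg` (the file named in the `felix.fileinstall.filename` entry), on top of the defaults from the `openflow-provider-config` YANG model. A sketch of what such a file could look like, using only values shown in the log; note the log does not distinguish which values came from the file versus YANG defaults:

```properties
# etc/org.opendaylight.openflowplugin.cfg -- values as reported in the log
rpc-requests-quota=20000
global-notification-quota=64000
switch-features-mandatory=false
enable-flow-removed-notification=true
is-statistics-polling-on=true
skip-table-features=true
basic-timer-delay=3000
maximum-timer-delay=900000
thread-pool-max-threads=32000
device-connection-rate-limit-per-min=0
```

Felix FileInstall injects `felix.fileinstall.filename` and `service.pid` into the resulting configuration dictionary, which is why those two bookkeeping keys also appear in the "configuration property was changed" entries.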
2025-10-14T00:47:31,521 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 has been started
2025-10-14T00:47:31,521 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.impl_0.20.1 [309] was successfully created
2025-10-14T00:47:31,525 | INFO | Blueprint Extender: 3 | LLDPSpeaker | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.1 | LLDPSpeaker started, it will send LLDP frames each 5 seconds
2025-10-14T00:47:31,549 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-inventory-operational status sync done true
2025-10-14T00:47:31,552 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done true
2025-10-14T00:47:31,566 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done true
2025-10-14T00:47:31,574 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-toaster-operational status sync done true
2025-10-14T00:47:31,617 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done true
2025-10-14T00:47:31,640 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | org.opendaylight.openflowplugin.applications.topology.lldp.LLDPLinkAger@20a58855 was registered as configuration listener to OpenFlowPlugin configuration service
2025-10-14T00:47:31,625 | WARN | CommitFutures-0 | OSGiEncryptionServiceConfigurator | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | Configuration update failed, attempting to continue
org.opendaylight.mdsal.common.api.OptimisticLockFailedException: Optimistic lock failed for path /(config:aaa:authn:encrypt:service:config?revision=2024-02-02)aaa-encrypt-service-config
	at org.opendaylight.controller.cluster.datastore.ShardDataTree.canCommitEntry(ShardDataTree.java:818) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.ShardDataTree.processNextPendingTransaction(ShardDataTree.java:797) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.ShardDataTree.startCanCommit(ShardDataTree.java:959) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.CommitCohort.canCommit(CommitCohort.java:135) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.directCommit(FrontendReadWriteTransaction.java:425) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.handleModifyTransaction(FrontendReadWriteTransaction.java:594) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.doHandleRequest(FrontendReadWriteTransaction.java:196) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.FrontendTransaction.handleRequest(FrontendTransaction.java:135) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.AbstractFrontendHistory.handleTransactionRequest(AbstractFrontendHistory.java:122) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:133) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:515) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:345) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:293) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[bundleFile:11.0.2]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) ~[?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) ~[?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) ~[?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) ~[?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) ~[?:?]
Caused by: org.opendaylight.yangtools.yang.data.tree.api.ConflictingModificationAppliedException: Node was created by other transaction.
	at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkConflicting(SchemaAwareApplyOperation.java:69) ~[bundleFile:?]
	at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkWriteApplicable(SchemaAwareApplyOperation.java:172) ~[bundleFile:?]
	at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkApplicable(SchemaAwareApplyOperation.java:102) ~[bundleFile:?]
	at org.opendaylight.yangtools.yang.data.tree.impl.AbstractNodeContainerModificationStrategy.checkChildPreconditions(AbstractNodeContainerModificationStrategy.java:441) ~[bundleFile:?]
	at org.opendaylight.yangtools.yang.data.tree.impl.AbstractNodeContainerModificationStrategy.checkTouchApplicable(AbstractNodeContainerModificationStrategy.java:400) ~[bundleFile:?]
	at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkApplicable(SchemaAwareApplyOperation.java:101) ~[bundleFile:?]
	at org.opendaylight.yangtools.yang.data.tree.impl.InMemoryDataTreeModification.validate(InMemoryDataTreeModification.java:615) ~[bundleFile:?]
	at org.opendaylight.yangtools.yang.data.tree.impl.InMemoryDataTreeModification.lockedValidate(InMemoryDataTreeModification.java:625) ~[bundleFile:?]
	at org.opendaylight.yangtools.yang.data.tree.impl.InMemoryDataTreeModification.validate(InMemoryDataTreeModification.java:603) ~[bundleFile:?]
	at org.opendaylight.yangtools.yang.data.tree.impl.AbstractDataTreeTip.validate(AbstractDataTreeTip.java:33) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.ShardDataTree.canCommitEntry(ShardDataTree.java:811) ~[bundleFile:?]
	... 46 more
2025-10-14T00:47:31,659 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | org.opendaylight.openflowplugin.applications.frm.impl.ForwardingRulesManagerImpl@2a045984 was registered as configuration listener to OpenFlowPlugin configuration service
2025-10-14T00:47:31,660 | INFO | Blueprint Extender: 3 | NodeConnectorInventoryEventTranslator | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.1 | NodeConnectorInventoryEventTranslator has started.
2025-10-14T00:47:31,665 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 has been started
2025-10-14T00:47:31,671 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.lldp-speaker_0.20.1 [300] was successfully created
2025-10-14T00:47:31,695 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done true
2025-10-14T00:47:31,695 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-toaster-config status sync done true
2025-10-14T00:47:31,713 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | FlowCapableTopologyProvider | 304 - org.opendaylight.openflowplugin.applications.topology-manager - 0.20.1 | Topology Manager service started.
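The OptimisticLockFailedException above is a transient startup race: two writers tried to create the same `aaa-encrypt-service-config` node, the loser's can-commit validation failed with ConflictingModificationAppliedException, and the configurator logs "attempting to continue". The usual remedy for this class of failure is to re-read and re-run the transaction. A generic retry-on-conflict sketch with stand-in types, not the real MD-SAL API or OpenDaylight's actual handler:

```java
// Hypothetical sketch of retry-on-optimistic-lock-conflict.
// OptimisticLockFailed stands in for
// org.opendaylight.mdsal.common.api.OptimisticLockFailedException;
// TxAction stands in for building and committing a datastore transaction.
public class OptimisticRetry {
    static class OptimisticLockFailed extends Exception {}

    interface TxAction {
        void commit() throws OptimisticLockFailed;
    }

    /** Re-runs the transaction on conflict; returns the attempt count that succeeded. */
    static int commitWithRetry(TxAction tx, int maxAttempts) throws OptimisticLockFailed {
        for (int attempt = 1; ; attempt++) {
            try {
                tx.commit();        // rebuild + commit; a fresh attempt re-reads current state
                return attempt;
            } catch (OptimisticLockFailed e) {
                if (attempt >= maxAttempts) {
                    throw e;        // give up after the configured number of tries
                }
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulate a writer that loses the race twice, then succeeds.
        int[] failuresLeft = {2};
        int attempts = commitWithRetry(() -> {
            if (failuresLeft[0]-- > 0) {
                throw new OptimisticLockFailed();
            }
        }, 5);
        System.out.println("committed after " + attempts + " attempts");
    }
}
```

Because the can-commit check is the serialization point on the shard leader, a retried transaction sees the winner's data and typically succeeds immediately, which is why a single WARN with no follow-up error is generally benign at startup.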
2025-10-14T00:47:31,786 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.1 | Starting LLDPActivator with lldpSecureKey: aa9251f8-c7c0-4322-b8d6-c3a84593bda3
2025-10-14T00:47:31,787 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.1 | LLDPDiscoveryListener started.
2025-10-14T00:47:31,788 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 has been started
2025-10-14T00:47:31,789 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery_0.20.1 [303] was successfully created
2025-10-14T00:47:31,861 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ArbitratorReconciliationManagerImpl | 296 - org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl - 0.20.1 | ArbitratorReconciliationManager has started successfully.
2025-10-14T00:47:31,870 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational#825029883], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2025-10-14T00:47:31,871 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational#825029883], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2025-10-14T00:47:31,872 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational#825029883], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 1.175 ms
2025-10-14T00:47:31,883 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | OSGiPasswordServiceConfigBootstrap | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | Listening for password service configuration
2025-10-14T00:47:31,900 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T00:47:31,910 | ERROR | opendaylight-cluster-data-notification-dispatcher-52 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | bundle org.opendaylight.aaa.idm-store-h2:0.21.2 (167)[org.opendaylight.aaa.datastore.h2.H2Store(118)] : Constructor argument 0 in class class org.opendaylight.aaa.datastore.h2.H2Store has unsupported type org.opendaylight.aaa.datastore.h2.ConnectionProvider
2025-10-14T00:47:31,924 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | MD-SAL configuration-based SwitchConnectionProviders started
2025-10-14T00:47:31,926 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.srm-shell/0.20.1
2025-10-14T00:47:31,929 | INFO | opendaylight-cluster-data-notification-dispatcher-54 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Starting instance of type 'openflow-switch-connection-provider-default-impl'
2025-10-14T00:47:31,931 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will utilize default iteration count=20000
2025-10-14T00:47:31,931 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will utilize default algorithm=SHA-512
2025-10-14T00:47:31,931 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will not utilize a private salt, since none was configured
2025-10-14T00:47:31,952 | INFO | Blueprint Extender: 1 | ForwardingRulesManagerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | ForwardingRulesManager has started successfully.
2025-10-14T00:47:31,944 | INFO | Blueprint Extender: 3 | LazyBindingList | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Using lazy population for lists larger than 16 element(s)
2025-10-14T00:47:31,954 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 has been started
2025-10-14T00:47:31,954 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager_0.20.1 [299] was successfully created
2025-10-14T00:47:31,956 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | H2 IDMStore activated
2025-10-14T00:47:31,957 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T00:47:31,961 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig]
2025-10-14T00:47:31,963 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig]
2025-10-14T00:47:31,973 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig]
2025-10-14T00:47:31,984 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)]
2025-10-14T00:47:31,997 | INFO | Blueprint Extender: 3 | AaaCertMdsalProvider | 163 - org.opendaylight.aaa.cert - 0.21.2 | AaaCertMdsalProvider Initialized
2025-10-14T00:47:32,034 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done true
2025-10-14T00:47:32,046 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}]
2025-10-14T00:47:32,049 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}]
2025-10-14T00:47:32,055 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] already present
2025-10-14T00:47:32,055 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] already present
2025-10-14T00:47:32,085 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | OSGiClusterAdmin | 193 - org.opendaylight.controller.sal-cluster-admin-impl - 11.0.2 | Cluster Admin services started
2025-10-14T00:47:32,086 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type CONFIGURATION started
2025-10-14T00:47:32,127 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | AAAEncryptionServiceImpl | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | AAAEncryptionService activated
2025-10-14T00:47:32,128 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | OSGiEncryptionServiceConfigurator | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | Encryption Service enabled
2025-10-14T00:47:32,131 | INFO | Blueprint Extender: 3 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.2 | Certificate Manager service has been initialized
2025-10-14T00:47:32,136 | INFO | Blueprint Extender: 3 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.2 | AaaCert Rpc Service has been initialized
2025-10-14T00:47:32,138 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 has been started
2025-10-14T00:47:32,143 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.aaa.cert_0.21.2 [163] was successfully created
2025-10-14T00:47:32,156 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | YangLibraryWriter | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | ietf-yang-library writer started with modules-state enabled
2025-10-14T00:47:32,166 | INFO | Blueprint Extender: 1 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Checking if default entries must be created in IDM store
2025-10-14T00:47:32,199 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational#-209908324], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}
2025-10-14T00:47:32,200 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational#-209908324], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}}
2025-10-14T00:47:32,200 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational#-209908324], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} in 699.9 μs
2025-10-14T00:47:32,256 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | FlowCapableTopologyProvider | 304 - org.opendaylight.openflowplugin.applications.topology-manager - 0.20.1 | Topology node flow:1 is successfully written to the operational datastore.
2025-10-14T00:47:32,355 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational#-1023171196], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}
2025-10-14T00:47:32,355 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational#-1023171196], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}}
2025-10-14T00:47:32,356 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational#-1023171196], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} in 834.4 μs
2025-10-14T00:47:32,382 | INFO | opendaylight-cluster-data-notification-dispatcher-54 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | OpenFlowPluginProvider started, waiting for onSystemBootReady()
2025-10-14T00:47:32,383 | INFO | opendaylight-cluster-data-notification-dispatcher-54 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@68bd8afb
2025-10-14T00:47:32,394 | INFO | opendaylight-cluster-data-notification-dispatcher-54 | OnfExtensionProvider | 308 - org.opendaylight.openflowplugin.extension-onf - 0.20.1 | ONF Extension Provider started.
2025-10-14T00:47:32,395 | INFO | opendaylight-cluster-data-notification-dispatcher-54 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Starting instance of type 'openflow-switch-connection-provider-legacy-impl'
2025-10-14T00:47:32,397 | INFO | opendaylight-cluster-data-notification-dispatcher-54 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@38dc8353
2025-10-14T00:47:32,420 | INFO | Blueprint Extender: 1 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | Table AAA_DOMAINS does not exist, creating it
2025-10-14T00:47:32,439 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2025-10-14T00:47:32,440 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2025-10-14T00:47:32,528 | INFO | Blueprint Extender: 1 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Created default domain
2025-10-14T00:47:32,535 | INFO | Blueprint Extender: 1 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | Table AAA_ROLES does not exist, creating it
2025-10-14T00:47:32,563 | INFO | Blueprint Extender: 1 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Created 'admin' role
2025-10-14T00:47:32,579 | INFO | Blueprint Extender: 1 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Created 'user' role
2025-10-14T00:47:32,686 | INFO | Blueprint Extender: 1 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | Table AAA_USERS does not exist, creating it
2025-10-14T00:47:32,700 | INFO | Blueprint Extender: 1 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | Table AAA_GRANTS does not exist, creating it
2025-10-14T00:47:32,752 | INFO | Blueprint Extender: 1 | AAAShiroProvider | 172 - org.opendaylight.aaa.shiro - 0.21.2 | AAAShiroProvider Session Initiated
2025-10-14T00:47:32,872 | INFO | Blueprint Extender: 1 | IniSecurityManagerFactory | 171 - org.opendaylight.aaa.repackaged-shiro - 0.21.2 | Realms have been explicitly set on the SecurityManager instance - auto-setting of realms will not occur.
2025-10-14T00:47:32,898 | ERROR | Blueprint Extender: 1 | MdsalRestconfServer | 279 - org.opendaylight.netconf.restconf-server-mdsal - 9.0.1 | bundle org.opendaylight.netconf.restconf-server-mdsal:9.0.1 (279)[org.opendaylight.restconf.server.mdsal.MdsalRestconfServer(92)] : Constructor argument 5 in class class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer has unsupported type [Lorg.opendaylight.restconf.server.spi.RpcImplementation;
2025-10-14T00:47:32,980 | INFO | Blueprint Extender: 1 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278]]
2025-10-14T00:47:32,981 | INFO | paxweb-config-1-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-11,contextPath='/rests'}
2025-10-14T00:47:32,981 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-9,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=308, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}", size=2}
2025-10-14T00:47:32,981 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-11,contextPath='/rests'}
2025-10-14T00:47:32,982 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-9,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=308, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@2eb5bb67{/rests,null,STOPPED}
2025-10-14T00:47:32,983 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@2eb5bb67{/rests,null,STOPPED}
2025-10-14T00:47:32,986 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}
2025-10-14T00:47:32,987 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}", size=2}
2025-10-14T00:47:32,987 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2025-10-14T00:47:32,987 | INFO | Blueprint Extender: 1 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278] registered context path /rests with 4 service(s)
2025-10-14T00:47:32,988 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/rests" with default Osgi Context OsgiContextModel{WB,id=OCM-9,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=308, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}
2025-10-14T00:47:32,989 | INFO | Blueprint Extender: 1 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278] registered context path /.well-known with 3 service(s)
2025-10-14T00:47:32,990 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Initializing CustomFilterAdapter
2025-10-14T00:47:32,991 | INFO | Blueprint Extender: 1 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | Binding URL provider org.opendaylight.restconf.server.jaxrs.JaxRsYangLibrary@1e5b14a5
2025-10-14T00:47:32,991 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Injecting a new filter chain with 0 Filters:
2025-10-14T00:47:32,991 | INFO | paxweb-config-1-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@2eb5bb67{/rests,null,AVAILABLE}
2025-10-14T00:47:32,992 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-9,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=308, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}} as OSGi service for "/rests" context path
2025-10-14T00:47:32,992 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2025-10-14T00:47:32,992 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}
2025-10-14T00:47:32,992 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}", size=2}
2025-10-14T00:47:32,992 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2025-10-14T00:47:32,993 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2025-10-14T00:47:32,993 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}
2025-10-14T00:47:32,993 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}", size=1}
2025-10-14T00:47:32,993 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}
2025-10-14T00:47:32,993 | INFO | paxweb-config-1-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-20,contextPath='/.well-known'}
2025-10-14T00:47:32,993 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-15,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}", size=2}
2025-10-14T00:47:32,993 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-20,contextPath='/.well-known'}
2025-10-14T00:47:32,994 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-15,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@20d8764{/.well-known,null,STOPPED}
2025-10-14T00:47:32,994 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@20d8764{/.well-known,null,STOPPED}
2025-10-14T00:47:32,995 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-18,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]}
2025-10-14T00:47:32,995 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-18,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]}", size=2}
2025-10-14T00:47:32,995 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2025-10-14T00:47:32,995 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known
2025-10-14T00:47:32,995 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/.well-known" with default Osgi Context OsgiContextModel{WB,id=OCM-15,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}
2025-10-14T00:47:32,995 | INFO | paxweb-config-1-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@20d8764{/.well-known,null,AVAILABLE}
2025-10-14T00:47:32,996 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-15,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}} as OSGi service for "/.well-known" context path
2025-10-14T00:47:32,996 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2025-10-14T00:47:32,996 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-19,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]}
2025-10-14T00:47:32,996 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-19,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]}", size=1}
2025-10-14T00:47:32,996 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-19,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]}
2025-10-14T00:47:33,048 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4AddressNoZone
2025-10-14T00:47:33,049 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4Prefix
2025-10-14T00:47:33,050 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6AddressNoZone
2025-10-14T00:47:33,050 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6Prefix
2025-10-14T00:47:33,077 | INFO | Blueprint Extender: 1 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.1 | Initialized with service class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer
2025-10-14T00:47:33,077 | INFO | Blueprint Extender: 1 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.1 | Initialized with base path: /restconf, default encoding: JSON, default pretty print: false
2025-10-14T00:47:33,122 | INFO | Blueprint Extender: 1 | OSGiNorthbound | 275 - org.opendaylight.netconf.restconf-nb - 9.0.1 | Global RESTCONF northbound pools started
2025-10-14T00:47:33,126 | INFO | paxweb-config-1-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-24,contextPath='/auth'}
2025-10-14T00:47:33,126 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-22,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=318, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}", size=2}
2025-10-14T00:47:33,126 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-24,contextPath='/auth'}
2025-10-14T00:47:33,127 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-22,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=318, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@67d66c0e{/auth,null,STOPPED}
2025-10-14T00:47:33,127 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@67d66c0e{/auth,null,STOPPED}
2025-10-14T00:47:33,128 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-25,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]}
2025-10-14T00:47:33,128 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-25,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]}", size=2}
2025-10-14T00:47:33,128 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2025-10-14T00:47:33,128 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/auth" with default Osgi Context OsgiContextModel{WB,id=OCM-22,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=318, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}
2025-10-14T00:47:33,128 | INFO | Blueprint Extender: 1 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.aaa.shiro_0.21.2 [172] registered context path /auth with 4 service(s)
2025-10-14T00:47:33,128 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Initializing CustomFilterAdapter
2025-10-14T00:47:33,128 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 has been started
2025-10-14T00:47:33,128 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Injecting a new filter chain with 0 Filters:
2025-10-14T00:47:33,129 | INFO | paxweb-config-1-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@67d66c0e{/auth,null,AVAILABLE}
2025-10-14T00:47:33,129 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.aaa.shiro_0.21.2 [172] was successfully created
2025-10-14T00:47:33,129 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-22,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=318, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}} as OSGi service for "/auth" context path
2025-10-14T00:47:33,129 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2025-10-14T00:47:33,129 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known
2025-10-14T00:47:33,129 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2025-10-14T00:47:33,129 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled |
396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-26,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]} 2025-10-14T00:47:33,129 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-26,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]}", size=2} 2025-10-14T00:47:33,129 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth 2025-10-14T00:47:33,130 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests 2025-10-14T00:47:33,130 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known 2025-10-14T00:47:33,130 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-14T00:47:33,130 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-27,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]} 2025-10-14T00:47:33,130 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-27,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]}", size=1} 2025-10-14T00:47:33,130 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - 
org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-27,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]} 2025-10-14T00:47:33,865 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | checkBundleDiagInfos: Elapsed time 13s, remaining time 286s, diag: Active {INSTALLED=0, RESOLVED=10, UNKNOWN=0, GRACE_PERIOD=0, WAITING=0, STARTING=0, ACTIVE=397, STOPPING=0, FAILURE=0} 2025-10-14T00:47:33,865 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 2025-10-14T00:47:33,865 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.7 | Now notifying all its registered SystemReadyListeners... 2025-10-14T00:47:33,865 | INFO | SystemReadyService-0 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | onSystemBootReady() received, starting the switch connections 2025-10-14T00:47:33,974 | INFO | epollEventLoopGroup-4-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6633 2025-10-14T00:47:33,974 | INFO | epollEventLoopGroup-2-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6653 2025-10-14T00:47:33,975 | INFO | epollEventLoopGroup-2-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6653 2025-10-14T00:47:33,975 | INFO | epollEventLoopGroup-4-1 | SwitchConnectionProviderImpl | 318 - 
org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6633 2025-10-14T00:47:33,975 | INFO | epollEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@68bd8afb started 2025-10-14T00:47:33,975 | INFO | epollEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@38dc8353 started 2025-10-14T00:47:33,975 | INFO | epollEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | All switchConnectionProviders are up and running (2). 2025-10-14T00:47:35,246 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-719231303]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550] 2025-10-14T00:47:35,246 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.171.168:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-719231303]] (version [1.0.3]) 2025-10-14T00:47:35,284 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - 
Node [pekko://opendaylight-cluster-data@10.30.170.116:2550] is JOINING, roles [member-2, dc-default], version [0.0.0] 2025-10-14T00:47:35,285 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#1235762114] was unhandled. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T00:47:35,285 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#1790704553] was unhandled. [4] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T00:47:38,265 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550 2025-10-14T00:47:38,265 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550 2025-10-14T00:47:38,265 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-default-config 2025-10-14T00:47:38,265 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-10-14T00:47:38,265 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-topology-config 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-config with address 
pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-default-config 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | 
ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-10-14T00:47:38,266 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-topology-config 2025-10-14T00:47:38,267 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-10-14T00:47:38,267 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-10-14T00:47:38,267 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - 
org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-10-14T00:47:38,267 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-10-14T00:47:38,267 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-10-14T00:50:20,364 | INFO | sshd-SshServer[441be407](port=8101)-nio2-thread-1 | OpenSSHKeyPairProvider | 121 - org.apache.karaf.shell.ssh - 4.4.8 | Creating ssh server private key at /tmp/karaf-0.23.0/etc/host.key 2025-10-14T00:50:20,366 | INFO | sshd-SshServer[441be407](port=8101)-nio2-thread-1 | OpenSSHKeyPairGenerator | 121 - org.apache.karaf.shell.ssh - 4.4.8 | generateKeyPair(RSA) generating host key - size=2048 2025-10-14T00:50:20,755 | INFO | sshd-SshServer[441be407](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.171.102:39078 authenticated 2025-10-14T00:50:24,326 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/010__Cluster_Reconcilliation_Multi_DPN.robot" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite 
/w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/010__Cluster_Reconcilliation_Multi_DPN.robot 2025-10-14T00:50:25,345 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Check Shards Status And Initialize Variables" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Check Shards Status And Initialize Variables 2025-10-14T00:50:26,205 | INFO | qtp1873851465-580 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.1 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-10-14T00:50:26,210 | INFO | qtp1873851465-580 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.1 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-10-14T00:50:26,915 | INFO | qtp1873851465-580 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.2 | Authentication is now enabled 2025-10-14T00:50:26,916 | INFO | qtp1873851465-580 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.2 | Authentication Manager activated 2025-10-14T00:50:26,960 | INFO | qtp1873851465-580 | ApiPathParser | 273 - org.opendaylight.netconf.restconf-api - 9.0.1 | Consecutive slashes in REST URLs will be rejected 2025-10-14T00:50:33,719 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Inventory Follower and Leader Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Inventory Follower and Leader Before Cluster Restart 2025-10-14T00:50:46,585 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test 
openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node1 2025-10-14T00:50:46,958 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Add Bulk Flow From Follower 2025-10-14T00:50:47,543 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader 2025-10-14T00:50:47,962 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Before Cluster Restart 2025-10-14T00:50:48,364 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 and Exit" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 and Exit 2025-10-14T00:50:48,759 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test 
openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Reconnect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Reconnect To Follower Node1 2025-10-14T00:50:49,151 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Reconnected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Reconnected To Follower Node1 2025-10-14T00:50:49,572 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 2025-10-14T00:50:50,023 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node2 2025-10-14T00:50:50,401 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Follower Node2 2025-10-14T00:50:50,810 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test 
openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node2 2025-10-14T00:50:51,519 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Inventory Leader 2025-10-14T00:50:51,891 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Leader 2025-10-14T00:50:52,315 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Inventory Leader 2025-10-14T00:50:52,689 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Delete All Flows From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Delete All Flows From Follower Node1 2025-10-14T00:50:53,051 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster 
Reconcilliation Multi DPN.Verify No Flows In Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify No Flows In Inventory Leader 2025-10-14T00:50:54,584 | INFO | sshd-SshServer[441be407](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.171.102:57602 authenticated 2025-10-14T00:50:55,365 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/020__Cluster_HA_Data_Recovery_BulkFlow_2Node_Cluster.robot" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/020__Cluster_HA_Data_Recovery_BulkFlow_2Node_Cluster.robot 2025-10-14T00:50:55,803 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status and Initialize Variables" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status and Initialize Variables 2025-10-14T00:51:05,554 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower Before Leader Restart 2025-10-14T00:51:06,728 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node 
Cluster.Shutdown Leader From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Leader From Cluster Node 2025-10-14T00:51:07,030 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL3 10.30.171.161" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Killing ODL3 10.30.171.161 2025-10-14T00:51:07,567 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: The connection closed with error: Connection reset 2025-10-14T00:51:10,192 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Shutdown" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Shutdown 2025-10-14T00:51:12,346 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.171.161:2550, Up)]. 
2025-10-14T00:51:12,349 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T00:51:12,349 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T00:51:12,351 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config#135908184], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config#135908184], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-10-14T00:51:12,352 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, 
backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational#825029883], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational#825029883], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-10-14T00:51:12,353 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: refreshing backend for shard 0 2025-10-14T00:51:12,353 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: refreshing backend for shard 0 2025-10-14T00:51:12,354 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational#-209908324], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, 
backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational#-209908324], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} 2025-10-14T00:51:12,354 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: refreshing backend for shard 1 2025-10-14T00:51:12,355 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational#-1023171196], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational#-1023171196], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} 2025-10-14T00:51:12,355 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: refreshing backend for shard 2 2025-10-14T00:51:15,333 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to 
[pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:51:15,356 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#1790704553] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T00:51:15,356 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/clusterReceptionist/replicator#24836848] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T00:51:15,356 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-1148832680] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T00:51:15,357 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-1148832680] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T00:51:15,357 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-1148832680] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. 
If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T00:51:15,358 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-1148832680] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T00:51:17,373 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.161:2550 is unreachable 2025-10-14T00:51:17,387 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Candidate): Starting new election term 3 2025-10-14T00:51:17,387 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 3 2025-10-14T00:51:17,388 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@77ddea8b 2025-10-14T00:51:17,388 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Follower to Candidate 2025-10-14T00:51:17,388 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Follower to Candidate 2025-10-14T00:51:17,403 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.161:2550 is unreachable 2025-10-14T00:51:17,403 | INFO | 
opendaylight-cluster-data-shard-dispatcher-30 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.161:2550 is unreachable 2025-10-14T00:51:17,406 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Candidate): Starting new election term 3 2025-10-14T00:51:17,406 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 3 2025-10-14T00:51:17,407 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Follower to Candidate 2025-10-14T00:51:17,407 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@57f32bb6 2025-10-14T00:51:17,407 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Follower to Candidate 2025-10-14T00:51:17,409 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational (Candidate): Starting new election term 3 2025-10-14T00:51:17,410 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | 
member-1-shard-toaster-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 3 2025-10-14T00:51:17,410 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from Follower to Candidate 2025-10-14T00:51:17,410 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6fc01c9 2025-10-14T00:51:17,410 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from Follower to Candidate 2025-10-14T00:51:17,422 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 3 2025-10-14T00:51:17,423 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2017c1e0 2025-10-14T00:51:17,423 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Candidate to Leader 2025-10-14T00:51:17,423 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Candidate to Leader 2025-10-14T00:51:17,423 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.161:2550 is unreachable 2025-10-14T00:51:17,423 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.161:2550 is unreachable 2025-10-14T00:51:17,424 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 3 2025-10-14T00:51:17,424 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from Candidate to Leader 2025-10-14T00:51:17,424 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@54a6cdb0 2025-10-14T00:51:17,424 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from Candidate to Leader 2025-10-14T00:51:17,429 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Candidate | 190 - 
org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Candidate): Starting new election term 3 2025-10-14T00:51:17,429 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Candidate): Starting new election term 3 2025-10-14T00:51:17,429 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 3 2025-10-14T00:51:17,429 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 3 2025-10-14T00:51:17,429 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Follower to Candidate 2025-10-14T00:51:17,429 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@5d52835a 2025-10-14T00:51:17,430 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-config , received role change from Follower to Candidate 2025-10-14T00:51:17,430 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received 
LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@529ce96b 2025-10-14T00:51:17,430 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-default-config from Follower to Candidate 2025-10-14T00:51:17,430 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Follower to Candidate 2025-10-14T00:51:17,440 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 3 2025-10-14T00:51:17,441 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@1030a12c 2025-10-14T00:51:17,442 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 3 2025-10-14T00:51:17,442 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@433e585d 2025-10-14T00:51:17,443 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RoleChangeNotifier | 194 - 
org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-config , received role change from Candidate to Leader 2025-10-14T00:51:17,443 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-default-config from Candidate to Leader 2025-10-14T00:51:17,447 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Candidate to Leader 2025-10-14T00:51:17,447 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Candidate to Leader 2025-10-14T00:51:17,449 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#-987011148], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present} 2025-10-14T00:51:17,449 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#756004085], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present} 2025-10-14T00:51:17,449 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config#135908184], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#-987011148], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} 2025-10-14T00:51:17,449 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational#825029883], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#756004085], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} 2025-10-14T00:51:17,452 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config#135908184], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#-987011148], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 2.883 ms 2025-10-14T00:51:17,452 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational#825029883], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#756004085], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 2.824 ms 2025-10-14T00:51:17,452 | INFO | 
opendaylight-cluster-data-shard-dispatcher-30 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.161:2550 is unreachable 2025-10-14T00:51:17,452 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.161:2550 is unreachable 2025-10-14T00:51:17,456 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Candidate): Starting new election term 3 2025-10-14T00:51:17,456 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 3 2025-10-14T00:51:17,456 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Follower to Candidate 2025-10-14T00:51:17,456 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@3f9b31c2 2025-10-14T00:51:17,457 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Follower to Candidate 2025-10-14T00:51:17,457 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | 
Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate): Starting new election term 3
2025-10-14T00:51:17,457 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 3
2025-10-14T00:51:17,457 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Follower to Candidate
2025-10-14T00:51:17,457 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6f912ccc
2025-10-14T00:51:17,457 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Follower to Candidate
2025-10-14T00:51:17,462 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Follower): Term 3 in "RequestVote{term=3, candidateId=member-2-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term
2025-10-14T00:51:17,463 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 3
2025-10-14T00:51:17,464 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Candidate to Leader
2025-10-14T00:51:17,464 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@54afa91b
2025-10-14T00:51:17,465 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Candidate to Leader
2025-10-14T00:51:17,466 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#826717698], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=present}
2025-10-14T00:51:17,466 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational#-209908324], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#826717698], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=present}}
2025-10-14T00:51:17,467 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational#-209908324], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#826717698], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=present}} in 464.0 μs
2025-10-14T00:51:17,477 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@98b5d5e
2025-10-14T00:51:17,477 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done false
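The Raft role-change records above follow a fixed shape: `<shard> (<old role>) :- Switching from behavior <old> to <new>, election term: <n>`. A minimal sketch for pulling those transitions out of a log like this one; the regex and field names are assumptions based only on the record layout shown here, not part of any OpenDaylight tooling:

```python
import re

# Matches RaftActorBehavior records such as:
#   2025-10-14T00:51:17,457 | INFO | ... | RaftActorBehavior | ... |
#   member-1-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 3
TRANSITION = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2},\d{3}).*?"
    r"(?P<shard>member-\d+-shard-[\w-]+) \([A-Za-z]+\) :- "
    r"Switching from behavior (?P<old>\w+) to (?P<new>\w+), election term: (?P<term>\d+)"
)

def role_transitions(lines):
    """Yield (timestamp, shard, old_role, new_role, term) for each matching record."""
    for line in lines:
        m = TRANSITION.search(line)
        if m:
            yield (m["ts"], m["shard"], m["old"], m["new"], int(m["term"]))

# Illustrative input taken from the excerpt above (middle fields elided).
log = [
    "2025-10-14T00:51:17,457 | INFO | dispatcher | RaftActorBehavior | 190 | "
    "member-1-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 3",
]
print(list(role_transitions(log)))
```

One tuple per transition makes it easy to reconstruct the Follower → Candidate → Leader sequence a shard went through during the election window.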
2025-10-14T00:51:17,991 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done true
2025-10-14T00:51:21,560 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T00:51:21,560 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Association to [pekko://opendaylight-cluster-data@10.30.171.161:2550] with UID [-8579374529155840044] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. Reason: Cluster member removed, previous status [Down]
2025-10-14T00:51:21,560 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T00:51:23,142 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:27,432 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Candidate): Term 4 in "RequestVote{term=4, candidateId=member-2-shard-inventory-operational, lastLogIndex=5, lastLogTerm=2}" message is greater than Candidate's term 3 - switching to Follower
2025-10-14T00:51:27,439 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 4
2025-10-14T00:51:27,439 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Candidate to Follower
2025-10-14T00:51:27,439 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Candidate to Follower
2025-10-14T00:51:27,441 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@77d4212e
2025-10-14T00:51:27,442 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready
2025-10-14T00:51:27,442 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-inventory-operational status sync done false
2025-10-14T00:51:27,446 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-1-shard-inventory-operational status sync done true
2025-10-14T00:51:27,451 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}
2025-10-14T00:51:27,451 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational#-1023171196], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}}
2025-10-14T00:51:27,452 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational#-1023171196], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} in 544.8 μs
2025-10-14T00:51:27,491 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate): Term 4 in "RequestVote{term=4, candidateId=member-2-shard-inventory-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 3 - switching to Follower
2025-10-14T00:51:27,497 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 4
2025-10-14T00:51:27,498 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Candidate to Follower
2025-10-14T00:51:27,498 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Candidate to Follower
2025-10-14T00:51:27,499 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@46996339
2025-10-14T00:51:27,500 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready
2025-10-14T00:51:27,500 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done false
2025-10-14T00:51:27,823 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:28,018 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done true
2025-10-14T00:51:31,956 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shard Status For Leader After PreLeader Shutdown" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shard Status For Leader After PreLeader Shutdown
2025-10-14T00:51:32,792 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node1
2025-10-14T00:51:34,573 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:35,851 | INFO | epollEventLoopGroup-5-1 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Connection closed by device, Device:/10.30.171.222:54836, NodeId:null
2025-10-14T00:51:35,918 | INFO | epollEventLoopGroup-5-2 | ConnectionAdapterImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Hello received
2025-10-14T00:51:36,042 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower
2025-10-14T00:51:36,283 | INFO | qtp1873851465-181 | StaticConfiguration | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding-over-DOM codec shortcuts are enabled
2025-10-14T00:51:36,297 | INFO | qtp1873851465-181 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Ping Pong Flow Tester Impl
2025-10-14T00:51:36,297 | INFO | qtp1873851465-181 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Transaction Chain Flow Writer Impl
2025-10-14T00:51:36,299 | INFO | ForkJoinPool-10-worker-1 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Number of Txn for dpId: openflow:1 is: 1
2025-10-14T00:51:36,299 | INFO | ForkJoinPool-10-worker-1 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@3f68573f for dpid: openflow:1
2025-10-14T00:51:36,383 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}
2025-10-14T00:51:36,383 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:51:36,384 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 521.3 μs
2025-10-14T00:51:36,411 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 connected.
2025-10-14T00:51:36,411 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | No context chain found for device: openflow:1, creating new.
2025-10-14T00:51:36,412 | INFO | epollEventLoopGroup-5-2 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Device connected to controller, Device:/10.30.171.222:54844, NodeId:Uri{value=openflow:1}
2025-10-14T00:51:36,443 | INFO | epollEventLoopGroup-5-2 | RoleContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s.
2025-10-14T00:51:36,534 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2025-10-14T00:51:36,576 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-10-14T00:51:36,613 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2025-10-14T00:51:36,614 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting DeviceContextImpl[NEW] service for node openflow:1
2025-10-14T00:51:36,638 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting RpcContextImpl[NEW] service for node openflow:1
2025-10-14T00:51:36,649 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:36,697 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting StatisticsContextImpl[NEW] service for node openflow:1
2025-10-14T00:51:36,698 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting RoleContextImpl[NEW] service for node openflow:1
2025-10-14T00:51:36,700 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}}
2025-10-14T00:51:36,700 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Requesting state change to BECOMEMASTER
2025-10-14T00:51:36,700 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER
2025-10-14T00:51:36,700 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | getGenerationIdFromDevice called for device: openflow:1
2025-10-14T00:51:36,710 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Started clustering services for node openflow:1
2025-10-14T00:51:36,711 | INFO | epollEventLoopGroup-5-2 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER
2025-10-14T00:51:36,713 | INFO | epollEventLoopGroup-5-2 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER
2025-10-14T00:51:36,724 | INFO | ofppool-0 | FlowNodeReconciliationImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}}
2025-10-14T00:51:36,761 | INFO | pool-16-thread-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 connection is enabled by reconciliation framework.
2025-10-14T00:51:36,787 | INFO | epollEventLoopGroup-5-2 | DeviceInitializationUtil | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.171.222}}
2025-10-14T00:51:36,787 | INFO | epollEventLoopGroup-5-2 | DeviceInitializationUtil | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Port number of the node openflow:1 is: 54844
2025-10-14T00:51:36,862 | INFO | epollEventLoopGroup-5-2 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPMETERFEATURES collected
2025-10-14T00:51:36,866 | INFO | epollEventLoopGroup-5-2 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPGROUPFEATURES collected
2025-10-14T00:51:36,880 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.MacAddress
2025-10-14T00:51:36,880 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.PhysAddress
2025-10-14T00:51:36,881 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.HexString
2025-10-14T00:51:36,881 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.DottedQuad
2025-10-14T00:51:36,881 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.Uuid
2025-10-14T00:51:36,883 | INFO | epollEventLoopGroup-5-2 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPPORTDESC collected
2025-10-14T00:51:36,906 | INFO | epollEventLoopGroup-5-2 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 successfully finished collecting
2025-10-14T00:51:37,008 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1
2025-10-14T00:51:37,017 | INFO | pool-16-thread-1 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 is able to work as master
2025-10-14T00:51:37,023 | INFO | pool-16-thread-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Role MASTER was granted to device openflow:1
2025-10-14T00:51:37,024 | INFO | pool-16-thread-1 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Publishing node added notification for Uri{value=openflow:1}
2025-10-14T00:51:37,026 | INFO | pool-16-thread-1 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting statistics gathering for node openflow:1
2025-10-14T00:51:37,064 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | LazyBindingMap | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Using lazy population for maps larger than 1 element(s)
2025-10-14T00:51:37,149 | WARN | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Seems like device is still owned by other controller instance. Skip deleting openflow:1 node from operational datastore.
2025-10-14T00:51:37,444 | INFO | ForkJoinPool-10-worker-1 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed FlowHandlerTask thread for dpid: openflow:1
2025-10-14T00:51:37,869 | INFO | CommitFutures-3 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed all flows installation for: dpid: openflow:1 in 1571845116ns
2025-10-14T00:51:37,891 | INFO | CommitFutures-3 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@3f68573f closed successfully.
2025-10-14T00:51:38,615 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader
2025-10-14T00:51:38,741 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:39,251 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:41,336 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:48,670 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:50,690 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:51,209 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:51:51,743 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Cluster Restart
2025-10-14T00:51:52,127 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Pre Leader From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Pre Leader From Cluster Node
2025-10-14T00:51:52,361 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL3 10.30.171.161" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting ODL3 10.30.171.161
2025-10-14T00:51:52,446 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=19312, lastAppliedTerm=4, lastIndex=20008, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=696, mandatoryTrim=false]
2025-10-14T00:51:52,452 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Persisting snapshot at EntryInfo[index=19312, term=4]/EntryInfo[index=20008, term=4]
2025-10-14T00:51:52,454 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 19312 and term: 4
2025-10-14T00:51:52,525 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: snapshot is durable as of 2025-10-14T00:51:52.453266647Z
2025-10-14T00:51:59,020 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:52:00,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-250083929]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550]
2025-10-14T00:52:00,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.171.168:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-250083929]] (version [1.0.3])
2025-10-14T00:52:00,243 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.161:2550] is JOINING, roles [member-3, dc-default], version [0.0.0]
2025-10-14T00:52:01,299 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T00:52:01,300 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config
2025-10-14T00:52:01,300 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config
2025-10-14T00:52:01,300 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config
2025-10-14T00:52:01,300 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T00:52:01,300 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config
2025-10-14T00:52:01,300 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-14T00:52:01,300 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready 2025-10-14T00:52:01,300 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-14T00:52:01,301 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-14T00:52:01,301 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-14T00:52:01,301 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Peer address for peer member-3-shard-topology-config set to 
pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-14T00:52:01,301 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-14T00:52:01,301 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-14T00:52:01,301 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-14T00:52:01,301 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-14T00:52:01,301 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-14T00:52:01,302 | INFO | 
opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-14T00:52:01,302 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-10-14T00:52:01,301 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-14T00:52:05,888 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 87, snapshotTerm: 2, replicatedToAllIndex: -1 2025-10-14T00:52:05,890 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): follower member-3-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-14T00:52:05,890 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | AbstractLeader | 190 - 
org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): Initiating install snapshot to follower member-3-shard-default-operational: follower nextIndex: 0, leader snapshotIndex: 87, leader lastIndex: 124, leader log size: 37 2025-10-14T00:52:05,890 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=124, lastAppliedTerm=3, lastIndex=124, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-default-operational 2025-10-14T00:52:05,893 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 1, snapshotTerm: 2, replicatedToAllIndex: -1 2025-10-14T00:52:05,893 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): follower member-3-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-14T00:52:05,893 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): Initiating install snapshot to follower member-3-shard-topology-operational: follower nextIndex: 0, leader snapshotIndex: 1, leader lastIndex: 5, 
leader log size: 4 2025-10-14T00:52:05,893 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=5, lastAppliedTerm=3, lastIndex=5, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-topology-operational 2025-10-14T00:52:05,896 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Persising snapshot at EntryInfo[index=5, term=3]/EntryInfo[index=5, term=3] 2025-10-14T00:52:05,916 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 1 and term: 2 2025-10-14T00:52:05,917 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: snapshot is durable as of 2025-10-14T00:52:05.901125796Z 2025-10-14T00:52:05,917 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Persising snapshot at EntryInfo[index=124, term=3]/EntryInfo[index=124, term=3] 2025-10-14T00:52:05,918 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 87 and term: 2 2025-10-14T00:52:05,922 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | 
member-1-shard-default-operational: snapshot is durable as of 2025-10-14T00:52:05.918325789Z 2025-10-14T00:52:06,070 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): Snapshot successfully installed on follower member-3-shard-topology-operational (last chunk 1) - matchIndex set to 5, nextIndex set to 6 2025-10-14T00:52:06,101 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): Snapshot successfully installed on follower member-3-shard-default-operational (last chunk 1) - matchIndex set to 124, nextIndex set to 125 2025-10-14T00:52:06,179 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=0}, nanosAgo=48736417152, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1} 2025-10-14T00:52:06,457 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0}, nanosAgo=49010329835, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1} 2025-10-14T00:52:16,859 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Restart" | core | 112 - org.apache.karaf.log.core - 
4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Restart 2025-10-14T00:52:22,013 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Leader Restart 2025-10-14T00:52:22,351 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | TransmitQueue | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | No request matching Envelope{sessionId=5, txSequence=123, message=ModifyTransactionSuccess{target=member-1-datastore-operational-fe-0-txn-45-2, sequence=1}} found, ignoring response 2025-10-14T00:52:35,319 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Leader Restart 2025-10-14T00:52:35,794 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node1 2025-10-14T00:52:36,301 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Initiating snapshot 
capture CaptureSnapshot [lastAppliedIndex=39941, lastAppliedTerm=4, lastIndex=40008, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=67, mandatoryTrim=false] 2025-10-14T00:52:36,302 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=39941, term=4]/EntryInfo[index=40008, term=4] 2025-10-14T00:52:36,302 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 21008 and term: 4 2025-10-14T00:52:36,349 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: snapshot is durable as of 2025-10-14T00:52:36.302509228Z 2025-10-14T00:52:36,513 | INFO | epollEventLoopGroup-5-2 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Connection closed by device, Device:/10.30.171.222:54844, NodeId:openflow:1 2025-10-14T00:52:36,514 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 disconnected. 
2025-10-14T00:52:36,514 | INFO | epollEventLoopGroup-5-2 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | Stopping reconciliation for node Uri{value=openflow:1} 2025-10-14T00:52:36,524 | INFO | epollEventLoopGroup-5-2 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Publishing node removed notification for Uri{value=openflow:1} 2025-10-14T00:52:36,525 | INFO | epollEventLoopGroup-5-2 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | Stopping reconciliation for node Uri{value=openflow:1} 2025-10-14T00:52:36,525 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Role SLAVE was granted to device openflow:1 2025-10-14T00:52:36,526 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping RoleContextImpl[RUNNING] service for node openflow:1 2025-10-14T00:52:36,526 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1 2025-10-14T00:52:36,526 | INFO | epollEventLoopGroup-5-2 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping running statistics gathering for node openflow:1 2025-10-14T00:52:36,605 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2025-10-14T00:52:36,910 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping RpcContextImpl[RUNNING] service for node openflow:1 2025-10-14T00:52:36,911 | INFO 
| epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1 2025-10-14T00:52:36,915 | INFO | epollEventLoopGroup-5-2 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Closed clustering services registration for node openflow:1 2025-10-14T00:52:36,915 | INFO | ofppool-1 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Closed clustering services for node openflow:1 2025-10-14T00:52:36,917 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1 2025-10-14T00:52:36,918 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1 2025-10-14T00:52:36,918 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1 2025-10-14T00:52:36,918 | INFO | epollEventLoopGroup-5-2 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping running statistics gathering for node openflow:1 2025-10-14T00:52:36,919 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1 2025-10-14T00:52:36,984 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2025-10-14T00:52:37,497 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from 
operational DS 2025-10-14T00:52:38,504 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node1 2025-10-14T00:52:38,814 | INFO | qtp1873851465-181 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Ping Pong Flow Tester Impl 2025-10-14T00:52:38,814 | INFO | qtp1873851465-181 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Transaction Chain Flow Writer Impl 2025-10-14T00:52:38,816 | INFO | ForkJoinPool-10-worker-2 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Number of Txn for dpId: openflow:1 is: 1 2025-10-14T00:52:38,816 | INFO | ForkJoinPool-10-worker-2 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@2247a358 for dpid: openflow:1 2025-10-14T00:52:38,882 | INFO | ForkJoinPool-10-worker-2 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed FlowHandlerTask thread for dpid: openflow:1 2025-10-14T00:52:38,946 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] 
at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 2025-10-14T00:52:38,995 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | newPosition > limit: (5480563 > 262064) 2025-10-14T00:52:39,000 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:52:39,001 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:53:08,483 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}}
2025-10-14T00:53:08,484 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: refreshing backend for shard 2
2025-10-14T00:53:08,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}
2025-10-14T00:53:08,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}}
2025-10-14T00:53:08,492 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} in 2.888 ms
2025-10-14T00:53:38,853 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:53:38,853 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: refreshing backend for shard 1
2025-10-14T00:53:38,857 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}
2025-10-14T00:53:38,857 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:53:38,873 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 15.21 ms
2025-10-14T00:54:19,758 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node
2025-10-14T00:54:20,232 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:54:20,234 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:20,235 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:54:20,236 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:20,237 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:54:20,238 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:20,243 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:54:20,245 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:20,246 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:54:20,247 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:20,248 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:54:20,249 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:20,249 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:54:20,250 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:20,251 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:54:20,251 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:20,252 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-14T00:54:20,253 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:20,253 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,254 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,255 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,255 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,256 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,256 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,257 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,258 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,259 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,259 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,260 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,260 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,261 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,262 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,263 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,263 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,272 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,273 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,275 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,275 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,276 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,277 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:20,278 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:20,279 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:21,273 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:21,275 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:21,276 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:21,278 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:21,791 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:21,793 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:21,793 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:21,794 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:22,313 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:22,314 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:22,315 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:22,316 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:22,831 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
2025-10-14T00:54:22,832 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:22,833 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
2025-10-14T00:54:22,834 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:22,835 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
2025-10-14T00:54:22,836 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:22,839 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
2025-10-14T00:54:22,840 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:22,841 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
2025-10-14T00:54:22,842 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:22,842 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
2025-10-14T00:54:22,843 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:22,844 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
2025-10-14T00:54:22,845 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:23,354 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
2025-10-14T00:54:23,357 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:23,357 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
2025-10-14T00:54:23,358 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:23,872 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
2025-10-14T00:54:23,874 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T00:54:24,394 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064)
	at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
	at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
	at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
	at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:24,396 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:24,396 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (5480563 > 262064) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] 
at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-14T00:54:24,397 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-14T00:54:38,893 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-10-14T00:54:38,895 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: refreshing backend for shard 1
2025-10-14T00:54:38,903 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}
2025-10-14T00:54:38,904 | INFO | CommitFutures-5 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed all flows installation for: dpid: openflow:1 in 692099378834ns
2025-10-14T00:54:38,904 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:54:38,905 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 1.423 ms
2025-10-14T00:54:38,906 | ERROR | CommitFutures-5 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error: TransactionCommitFailedException{message=canCommit encountered an unexpected failure, errorList=[RpcError [message=canCommit encountered an unexpected failure, severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ModifyTransactionRequest{target=member-1-datastore-config-fe-0-chn-8-txn-0-1, sequence=11, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#929207398], modifications=0, protocol=SIMPLE} timed out after 120.013726905 seconds. The backend for inventory is not available.]]} in Datastore write operation: dpid: openflow:1, begin tableId: 0, end tableId: 1, sourceIp: 10001
2025-10-14T00:54:38,903 | ERROR | CommitFutures-4 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@2247a358 FAILED due to: org.opendaylight.mdsal.common.api.TransactionCommitFailedException: canCommit encountered an unexpected failure
    at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.newWithCause(TransactionCommitFailedExceptionMapper.java:42) ~[bundleFile:14.0.18]
    at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.newWithCause(TransactionCommitFailedExceptionMapper.java:18) ~[bundleFile:14.0.18]
    at org.opendaylight.yangtools.util.concurrent.ExceptionMapper.apply(ExceptionMapper.java:98) ~[bundleFile:14.0.17]
    at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.apply(TransactionCommitFailedExceptionMapper.java:37) ~[bundleFile:14.0.18]
    at org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker.handleException(ConcurrentDOMDataBroker.java:189) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker$1.onFailure(ConcurrentDOMDataBroker.java:133) ~[bundleFile:?]
    at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1117) ~[bundleFile:?]
    at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1004) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:767) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:516) ~[bundleFile:?]
    at com.google.common.util.concurrent.SettableFuture.setException(SettableFuture.java:54) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.lambda$directCommit$4(AbstractProxyTransaction.java:510) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$handleReplayedModifyTransactionRequest$16(RemoteProxyTransaction.java:510) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) ~[?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) ~[?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) ~[?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) ~[?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) ~[?:?]
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ModifyTransactionRequest{target=member-1-datastore-config-fe-0-chn-8-txn-0-1, sequence=11, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#929207398], modifications=0, protocol=SIMPLE} timed out after 120.013726905 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[bundleFile:?]
    ... 26 more
2025-10-14T00:55:08,922 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:55:08,923 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: refreshing backend for shard 1
2025-10-14T00:55:08,928 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}
2025-10-14T00:55:08,928 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection
ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:55:08,929 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 872.7 μs
2025-10-14T00:55:38,943 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:55:38,944 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: refreshing backend for shard 1
2025-10-14T00:55:38,947 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}
2025-10-14T00:55:38,947 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:55:38,948 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 615.8 μs
2025-10-14T00:56:01,159 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower And Leader Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower And Leader Before Cluster Restart
2025-10-14T00:56:08,963 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:56:08,963 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: refreshing backend for shard 1
2025-10-14T00:56:08,966 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}
2025-10-14T00:56:08,966 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:56:08,967 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 479.9 μs
2025-10-14T00:56:13,072 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Follower From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Follower From Cluster Node
2025-10-14T00:56:13,482 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Shutdown" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Shutdown
2025-10-14T00:56:38,923 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:56:38,924 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: refreshing backend for shard 1
2025-10-14T00:56:38,937 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 |
ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}
2025-10-14T00:56:38,937 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:56:38,938 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 501.4 μs
2025-10-14T00:57:49,581 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClusterHeartbeat | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Scheduled sending of heartbeat was delayed. Previous heartbeat was sent [5798] ms ago, expected interval is [1000] ms. This may cause failure detection to mark members as unreachable. The reason can be thread starvation, CPU overload, or GC.
2025-10-14T00:57:49,581 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550
2025-10-14T00:57:49,582 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T00:57:49,582 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550
2025-10-14T00:57:49,582 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T00:57:49,583 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:57:49,583 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: refreshing backend for shard 1
2025-10-14T00:57:49,583 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} reconnecting as
ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} 2025-10-14T00:57:49,583 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: refreshing backend for shard 2 2025-10-14T00:57:49,582 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.116:2550, Up)]. 2025-10-14T00:57:49,584 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.171.161:2550, Up)]. 2025-10-14T00:57:49,584 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - is the new leader among reachable nodes (more leaders may exist) 2025-10-14T00:57:49,584 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist). 
2025-10-14T00:57:49,586 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Ignoring received gossip status from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.171.161:2550,-2733844580753472048)] 2025-10-14T00:57:49,587 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Ignoring received gossip status from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.116:2550,7634712660374977560)] 2025-10-14T00:57:49,588 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Ignoring received gossip status from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.171.161:2550,-2733844580753472048)] 2025-10-14T00:57:49,588 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Ignoring received gossip status from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.171.161:2550,-2733844580753472048)] 2025-10-14T00:57:49,590 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Ignoring received gossip status from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.116:2550,7634712660374977560)] 2025-10-14T00:57:49,592 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Ignoring received gossip status from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.116:2550,7634712660374977560)] 2025-10-14T00:57:50,586 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Marking node as REACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.116:2550, Up)]. 2025-10-14T00:57:50,586 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Marking node as REACHABLE [Member(pekko://opendaylight-cluster-data@10.30.171.161:2550, Up)]. 
2025-10-14T00:57:50,587 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - is no longer leader 2025-10-14T00:57:50,587 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received ReachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550 2025-10-14T00:57:50,587 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-default-config 2025-10-14T00:57:50,588 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-topology-config 2025-10-14T00:57:50,588 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-10-14T00:57:50,588 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-10-14T00:57:50,588 | INFO | 
opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received ReachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550 2025-10-14T00:57:50,588 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-default-config 2025-10-14T00:57:50,588 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-10-14T00:57:50,588 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-topology-config 2025-10-14T00:57:50,588 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR found unreachable members changed during stable-after period. Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2025-10-14T00:57:57.588262359Z. 
2025-10-14T00:57:50,588 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR found all unreachable members healed during stable-after period, no downing decision necessary for now. 
2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received ReachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T00:57:50,588 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is not the leader any more and not responsible for taking SBR decisions. 
2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for 
peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-10-14T00:57:50,589 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-14T00:57:50,590 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received ReachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T00:57:50,590 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-14T00:57:50,590 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-14T00:57:50,590 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-14T00:57:50,590 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-14T00:57:50,590 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-14T00:57:50,591 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to 
pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-14T00:57:50,590 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-14T00:57:50,591 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-14T00:57:50,592 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent} 2025-10-14T00:57:50,592 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} to 
ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} 2025-10-14T00:57:50,593 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent} 2025-10-14T00:57:50,593 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-10-14T00:57:50,593 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - 
org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 281.8 μs 2025-10-14T00:57:50,593 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} in 989.3 μs 2025-10-14T00:57:54,422 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test 
openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node 2025-10-14T00:57:57,403 | INFO | epollEventLoopGroup-5-3 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Connection closed by device, Device:/10.30.171.222:48638, NodeId:null 2025-10-14T00:57:57,503 | INFO | epollEventLoopGroup-5-4 | ConnectionAdapterImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Hello received 2025-10-14T00:57:57,656 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower Node1 2025-10-14T00:57:57,977 | INFO | qtp1873851465-671 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Ping Pong Flow Tester Impl 2025-10-14T00:57:57,978 | INFO | qtp1873851465-671 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Transaction Chain Flow Writer Impl 2025-10-14T00:57:57,979 | INFO | ForkJoinPool-10-worker-3 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Number of Txn for dpId: openflow:1 is: 1 2025-10-14T00:57:57,980 | INFO | ForkJoinPool-10-worker-3 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@187651bc for dpid: 
openflow:1
2025-10-14T00:57:58,002 | INFO | epollEventLoopGroup-5-4 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 connected.
2025-10-14T00:57:58,002 | INFO | epollEventLoopGroup-5-4 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | No context chain found for device: openflow:1, creating new.
2025-10-14T00:57:58,003 | INFO | epollEventLoopGroup-5-4 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Device connected to controller, Device:/10.30.171.222:48642, NodeId:Uri{value=openflow:1}
2025-10-14T00:57:58,004 | INFO | epollEventLoopGroup-5-4 | RoleContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s.
2025-10-14T00:57:58,064 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2025-10-14T00:57:58,143 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2025-10-14T00:57:58,143 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting DeviceContextImpl[NEW] service for node openflow:1
2025-10-14T00:57:58,144 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting RpcContextImpl[NEW] service for node openflow:1
2025-10-14T00:57:58,153 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting StatisticsContextImpl[NEW] service for node openflow:1
2025-10-14T00:57:58,153 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting RoleContextImpl[NEW] service for node openflow:1
2025-10-14T00:57:58,153 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}}
2025-10-14T00:57:58,153 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Requesting state change to BECOMEMASTER
2025-10-14T00:57:58,153 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER
2025-10-14T00:57:58,153 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | getGenerationIdFromDevice called for device: openflow:1
2025-10-14T00:57:58,154 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Started clustering services for node openflow:1
2025-10-14T00:57:58,155 | INFO | epollEventLoopGroup-5-4 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER
2025-10-14T00:57:58,156 | INFO | epollEventLoopGroup-5-4 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER
2025-10-14T00:57:58,157 | INFO | ofppool-1 | FlowNodeReconciliationImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}}
2025-10-14T00:57:58,271 | INFO | ForkJoinPool-10-worker-3 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed FlowHandlerTask thread for dpid: openflow:1
2025-10-14T00:57:58,796 | INFO | pool-16-thread-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 connection is enabled by reconciliation framework.
2025-10-14T00:57:58,940 | INFO | epollEventLoopGroup-5-4 | DeviceInitializationUtil | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.171.222}}
2025-10-14T00:57:58,940 | INFO | epollEventLoopGroup-5-4 | DeviceInitializationUtil | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Port number of the node openflow:1 is: 48642
2025-10-14T00:57:58,982 | INFO | epollEventLoopGroup-5-4 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPMETERFEATURES collected
2025-10-14T00:57:58,983 | INFO | epollEventLoopGroup-5-4 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPGROUPFEATURES collected
2025-10-14T00:57:58,984 | INFO | epollEventLoopGroup-5-4 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPPORTDESC collected
2025-10-14T00:57:58,985 | INFO | epollEventLoopGroup-5-4 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 successfully finished collecting
2025-10-14T00:57:59,002 | INFO | pool-16-thread-2 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 is able to work as master
2025-10-14T00:57:59,002 | INFO | pool-16-thread-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Role MASTER was granted to device openflow:1
2025-10-14T00:57:59,003 | INFO | pool-16-thread-2 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Publishing node added notification for Uri{value=openflow:1}
2025-10-14T00:57:59,003 | INFO | pool-16-thread-2 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting statistics gathering for node openflow:1
2025-10-14T00:57:59,017 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1
2025-10-14T00:58:18,765 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberLeft] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#1790704553] was unhandled. [477] dead letters encountered, of which 466 were not logged. The counter will be reset now. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:18,766 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberLeft] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#1235762114] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:19,040 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberExited: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550
2025-10-14T00:58:19,040 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberExited: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550
2025-10-14T00:58:19,040 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberExited] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#1235762114] was unhandled. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:19,041 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberExited] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#1790704553] was unhandled. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:19,042 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Exiting confirmed [pekko://opendaylight-cluster-data@10.30.170.116:2550]
2025-10-14T00:58:20,174 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550
2025-10-14T00:58:20,173 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550
2025-10-14T00:58:25,193 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Association to [pekko://opendaylight-cluster-data@10.30.170.116:2550] having UID [7634712660374977560] has been stopped. All messages to this UID will be delivered to dead letters. Reason: Cluster member removed, previous status [Exiting]
2025-10-14T00:58:28,418 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Candidate): Starting new election term 4
2025-10-14T00:58:28,418 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4
2025-10-14T00:58:28,419 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Follower to Candidate
2025-10-14T00:58:28,419 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@7bdc9537
2025-10-14T00:58:28,419 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-topology-config from Follower to Candidate
2025-10-14T00:58:28,438 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4
2025-10-14T00:58:28,439 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@64299299
2025-10-14T00:58:28,439 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Candidate to Leader
2025-10-14T00:58:28,439 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-topology-config from Candidate to Leader
2025-10-14T00:58:28,968 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-10-14T00:58:28,968 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], control stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-10-14T00:58:29,484 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-10-14T00:58:32,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-10-14T00:58:33,124 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-10-14T00:58:33,489 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Candidate): Starting new election term 5
2025-10-14T00:58:33,490 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 5
2025-10-14T00:58:33,491 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Follower to Candidate
2025-10-14T00:58:33,491 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@67a62b73
2025-10-14T00:58:33,491 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Follower to Candidate
2025-10-14T00:58:33,492 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}}
2025-10-14T00:58:33,492 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: refreshing backend for shard 2
2025-10-14T00:58:33,576 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 5
2025-10-14T00:58:33,577 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@cf38e51
2025-10-14T00:58:33,577 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Candidate to Leader
2025-10-14T00:58:33,577 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Candidate to Leader
2025-10-14T00:58:33,577 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready
2025-10-14T00:58:33,579 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1853413801], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=present}
2025-10-14T00:58:33,580 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1853413801], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=present}}
2025-10-14T00:58:33,593 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1075765873], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1853413801], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=present}} in 12.90 ms
2025-10-14T00:58:34,615 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: The connection closed with error: Connection reset
2025-10-14T00:58:34,795 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:35,724 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-toaster-operational#1844318297] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:35,724 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#756004085] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:35,724 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-toaster-config#935385975] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:35,725 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#826717698] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:35,725 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1853413801] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:35,725 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-topology-config#-2135773793] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:35,726 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#-987011148] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T00:58:35,832 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:36,870 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:37,391 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:37,912 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:38,950 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:42,071 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:42,590 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:44,152 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:46,231 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:47,271 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:48,831 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:51,434 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:51,952 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:53,992 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:58:58,023 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config#1467606537], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2025-10-14T00:58:58,024 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: refreshing backend for shard 1
2025-10-14T00:58:58,152 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:00,230 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:02,312 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:02,831 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:03,354 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection
refused 2025-10-14T00:59:03,873 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:04,390 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:05,429 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:05,950 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:06,439 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:06,990 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:07,510 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:08,030 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:08,553 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] 
Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:09,070 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:09,598 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:10,121 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:11,681 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of 
java.net.ConnectException: Connection refused 2025-10-14T00:59:12,199 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:13,790 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:14,310 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:14,829 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:15,870 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 
- org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:18,047 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] 
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-10-14T00:59:18,924 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:22,549 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:23,151 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:25,748 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:26,269 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:26,789 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:27,308 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:27,830 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:29,911 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] 
Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:30,430 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:30,949 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:33,032 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:34,070 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of 
java.net.ConnectException: Connection refused 2025-10-14T00:59:35,629 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:38,229 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:38,750 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:39,051 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader Before Follower Restart 2025-10-14T00:59:39,083 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | 
AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-10-14T00:59:39,789 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:40,309 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:40,829 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:42,394 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T00:59:43,429 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:45,508 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:46,028 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:46,549 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:47,589 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:49,147 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:49,717 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Follower Restart
2025-10-14T00:59:50,098 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Follower From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Follower From Cluster Node
2025-10-14T00:59:50,199 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:50,452 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Restart
2025-10-14T00:59:51,182 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:53,320 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:53,839 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T00:59:58,295 | INFO | CommitFutures-7 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed all flows installation for: dpid: openflow:1 in 120316554461ns
2025-10-14T00:59:58,294 | ERROR | CommitFutures-6 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@187651bc FAILED due to:
org.opendaylight.mdsal.common.api.TransactionCommitFailedException: canCommit encountered an unexpected failure
	at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.newWithCause(TransactionCommitFailedExceptionMapper.java:42) ~[bundleFile:14.0.18]
	at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.newWithCause(TransactionCommitFailedExceptionMapper.java:18) ~[bundleFile:14.0.18]
	at org.opendaylight.yangtools.util.concurrent.ExceptionMapper.apply(ExceptionMapper.java:98) ~[bundleFile:14.0.17]
	at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.apply(TransactionCommitFailedExceptionMapper.java:37) ~[bundleFile:14.0.18]
	at org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker.handleException(ConcurrentDOMDataBroker.java:189) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker$1.onFailure(ConcurrentDOMDataBroker.java:133) ~[bundleFile:?]
	at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1117) ~[bundleFile:?]
	at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1004) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:767) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:516) ~[bundleFile:?]
	at com.google.common.util.concurrent.SettableFuture.setException(SettableFuture.java:54) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.lambda$directCommit$4(AbstractProxyTransaction.java:510) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) ~[?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) ~[?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) ~[?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) ~[?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) ~[?:?]
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ModifyTransactionRequest{target=member-1-datastore-config-fe-0-chn-9-txn-0-1, sequence=21, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#929207398], modifications=0, protocol=SIMPLE} timed out after 120.021851264 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[bundleFile:?]
	... 26 more
2025-10-14T00:59:58,295 | ERROR | CommitFutures-7 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error: TransactionCommitFailedException{message=canCommit encountered an unexpected failure, errorList=[RpcError [message=canCommit encountered an unexpected failure, severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ModifyTransactionRequest{target=member-1-datastore-config-fe-0-chn-9-txn-0-1, sequence=21, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#929207398], modifications=0, protocol=SIMPLE} timed out after 120.021851264 seconds. The backend for inventory is not available.]]} in Datastore write operation: dpid: openflow:1, begin tableId: 0, end tableId: 1, sourceIp: 10001
2025-10-14T01:00:00,123 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
	... 5 more
2025-10-14T01:00:02,049 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:02,958 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:03,479 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:05,039 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:07,616 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:08,684 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:11,278 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:11,799 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:14,918 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:16,478 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:16,997 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:19,076 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:19,588 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:21,423 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
	... 5 more
2025-10-14T01:00:22,709 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:28,432 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:32,069 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:32,589 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:33,107 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:34,148 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:38,818 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:40,378 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:42,458 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:00:42,462 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
	...
5 more 2025-10-14T01:00:44,539 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:45,579 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:47,143 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:48,177 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:50,778 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:51,818 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:52,338 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:54,419 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:57,022 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream 
failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:57,538 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:00:59,618 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:01,179 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:02,727 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of 
java.net.ConnectException: Connection refused 2025-10-14T01:01:03,503 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-10-14T01:01:03,768 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:04,292 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:04,808 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:09,488 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:17,299 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:18,857 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:19,378 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:19,899 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:21,978 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream 
failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:24,543 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] 
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-10-14T01:01:26,659 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:27,178 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:30,298 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:35,501 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:37,059 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:38,097 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:39,659 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:40,178 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:40,698 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream 
failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:41,127 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:41,648 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:42,895 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:43,428 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of 
java.net.ConnectException: Connection refused 2025-10-14T01:01:45,583 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-10-14T01:01:47,231 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:47,232 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.161:2550: 3014 millis 2025-10-14T01:01:48,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:49,689 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:50,208 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: 
Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:52,809 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:53,331 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:53,848 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:54,368 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 
2025-10-14T01:01:54,887 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:55,930 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:56,450 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:58,529 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:01:59,048 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:00,088 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:00,557 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:02,168 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:03,226 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream 
failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:05,290 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:05,807 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:06,623 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] 
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-14T01:02:06,859 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:08,939 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:11,022 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:11,538 | WARN | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:12,577 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:13,049 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:15,698 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:16,218 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection 
to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:17,261 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:17,777 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:19,337 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:19,820 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command 
[Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:20,377 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:21,418 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:21,938 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:22,977 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:24,537 
| WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:25,577 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:26,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:27,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:27,647 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound 
connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:02:27,663 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] 
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] 
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:02:29,184 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:30,242 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:32,798 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:34,388 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:37,511 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:39,081 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:40,108 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:40,627 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:41,147 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:43,752 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:45,307 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:46,868 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:48,709 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:02:49,471 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:50,508 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:51,026 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:51,547 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:52,048 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:55,176 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:56,221 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:57,258 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:57,778 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:58,239 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:58,818 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:02:59,338 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:00,377 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:01,417 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:02,889 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:03,926 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:09,744 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:03:09,750 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:11,828 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:13,390 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:15,991 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:16,507 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:18,067 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:21,110 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:21,710 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:22,747 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:23,788 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:24,306 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:24,827 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:29,508 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:03:30,783 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-14T01:03:36,164 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-topology-config#-2135773793] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [3929] dead letters encountered, of which 3918 were not logged. The counter will be reset now. 
If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:03:36,165 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-toaster-operational#1844318297] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [1] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:03:36,165 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-toaster-config#935385975] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [2] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:03:36,166 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1853413801] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:03:36,166 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#826717698] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:03:36,166 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#-987011148] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:03:36,166 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-topology-config#-2135773793] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:03:36,167 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-toaster-operational#1844318297] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:03:36,167 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#756004085] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:03:36,167 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-toaster-config#935385975] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:03:36,167 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1853413801] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:03:36,263 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:37,817 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:39,897 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:41,468 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:41,987 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:43,027 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:43,561 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:44,066 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:47,189 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream 
failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:47,707 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:48,226 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:49,267 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:49,787 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of 
java.net.ConnectException: Connection refused 2025-10-14T01:03:51,822 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-10-14T01:03:52,909 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:53,387 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:53,952 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:54,988 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:55,508 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:56,027 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:57,067 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:58,107 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:03:58,627 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] 
Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:04:02,277 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:04:03,837 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:04:10,078 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:04:12,157 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of 
java.net.ConnectException: Connection refused 2025-10-14T01:04:12,677 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:04:12,863 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] 
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:04:14,758 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:21,001 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:26,712 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:33,903 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:04:34,508 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:35,027 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:37,628 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:41,757 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:43,867 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:46,992 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:47,507 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:54,943 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:04:55,088 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:57,367 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:04:58,412 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:01,016 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:04,066 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:07,258 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:07,778 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:09,337 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:09,857 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:11,422 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:14,006 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:14,526 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:15,983 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:05:16,086 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:16,517 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:24,362 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:26,437 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:26,929 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:33,252 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:33,757 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:37,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:05:41,539 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:42,626 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Follower Restart
2025-10-14T01:05:44,141 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:46,736 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:05:47,257 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound
connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:05:51,418 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:05:55,056 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:05:56,620 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:05:58,063 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at 
org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-10-14T01:05:58,166 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:05,437 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:10,112 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:11,666 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:14,787 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:19,093 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] 
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-10-14T01:06:22,577 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:27,257 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:30,381 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:31,937 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:34,020 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:38,179 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:40,133 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] 
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-14T01:06:40,258 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:42,332 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:44,345 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:47,527 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command 
[Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:55,028 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:06:56,888 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:07:01,173 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] 
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
	... 5 more
2025-10-14T01:07:01,572 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:02,088 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:02,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:05,727 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:12,479 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:20,022 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:22,213 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
	... 5 more
2025-10-14T01:07:24,233 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Follower Restart
2025-10-14T01:07:24,613 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node
2025-10-14T01:07:24,953 | INFO | epollEventLoopGroup-5-4 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Connection closed by device, Device:/10.30.171.222:48642, NodeId:openflow:1
2025-10-14T01:07:24,954 | INFO | epollEventLoopGroup-5-4 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 disconnected.
2025-10-14T01:07:24,954 | INFO | epollEventLoopGroup-5-4 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | Stopping reconciliation for node Uri{value=openflow:1}
2025-10-14T01:07:24,960 | INFO | epollEventLoopGroup-5-4 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Publishing node removed notification for Uri{value=openflow:1}
2025-10-14T01:07:24,961 | INFO | epollEventLoopGroup-5-4 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | Stopping reconciliation for node Uri{value=openflow:1}
2025-10-14T01:07:24,961 | INFO | epollEventLoopGroup-5-4 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Role SLAVE was granted to device openflow:1
2025-10-14T01:07:24,961 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping RoleContextImpl[RUNNING] service for node openflow:1
2025-10-14T01:07:24,962 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1
2025-10-14T01:07:24,962 | INFO | epollEventLoopGroup-5-4 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping running statistics gathering for node openflow:1
2025-10-14T01:07:24,963 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping RpcContextImpl[RUNNING] service for node openflow:1
2025-10-14T01:07:24,963 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1
2025-10-14T01:07:24,964 | INFO | epollEventLoopGroup-5-4 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Closed clustering services registration for node openflow:1
2025-10-14T01:07:24,964 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1
2025-10-14T01:07:24,964 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1
2025-10-14T01:07:24,964 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1
2025-10-14T01:07:24,964 | INFO | epollEventLoopGroup-5-4 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping running statistics gathering for node openflow:1
2025-10-14T01:07:24,964 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1
2025-10-14T01:07:24,965 | INFO | ofppool-1 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Closed clustering services for node openflow:1
2025-10-14T01:07:25,003 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false]
2025-10-14T01:07:25,043 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false]
2025-10-14T01:07:25,548 | INFO | node-cleaner-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS
2025-10-14T01:07:26,506 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:27,333 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node
2025-10-14T01:07:27,566 | INFO | qtp1873851465-722 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Ping Pong Flow Tester Impl
2025-10-14T01:07:27,566 | INFO | qtp1873851465-722 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Transaction Chain Flow Writer Impl
2025-10-14T01:07:27,567 | INFO | ForkJoinPool-10-worker-4 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Number of Txn for dpId: openflow:1 is: 1
2025-10-14T01:07:27,567 | INFO | ForkJoinPool-10-worker-4 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@69636bcf for dpid: openflow:1
2025-10-14T01:07:27,600 | INFO | ForkJoinPool-10-worker-4 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed FlowHandlerTask thread for dpid: openflow:1
2025-10-14T01:07:32,217 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:39,767 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:40,517 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:41,037 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:43,253 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
	... 5 more
2025-10-14T01:07:48,586 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:49,337 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:56,907 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:07:57,137 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:00,257 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:04,293 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
	... 5 more
2025-10-14T01:08:07,807 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:08,557 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:14,798 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:18,427 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:25,333 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-10-14T01:08:25,706 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:08:26,225 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:08:29,331 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:08:32,967 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:08:37,023 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - 
org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-topology-config#-2135773793] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4095] dead letters encountered, of which 4084 were not logged. The counter will be reset now. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:08:37,023 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#756004085] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [1] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:08:37,023 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#826717698] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [2] dead letters encountered. 
If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:08:37,023 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1853413801] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:08:37,024 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-toaster-config#935385975] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:08:37,024 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-topology-config#-2135773793] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:08:37,024 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#-987011148] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:08:37,024 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-toaster-operational#1844318297] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:08:37,024 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#826717698] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:08:37,024 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#756004085] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:08:37,024 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1853413801] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:08:37,126 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:41,287 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:42,326 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:46,373 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:08:50,117 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:52,707 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:53,217 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:53,737 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:58,397 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:08:58,916 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:06,516 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:07,413 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-14T01:09:08,336 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node After Follower Restart
2025-10-14T01:09:10,348 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:10,770 | INFO | sshd-SshServer[441be407](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.171.102:48120 authenticated
2025-10-14T01:09:14,510 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/030__Cluster_HA_Data_Recovery_BulkFlow_Single_Switch.robot" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/030__Cluster_HA_Data_Recovery_BulkFlow_Single_Switch.robot
2025-10-14T01:09:14,821 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Check Shards Status And Initialize Variables" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Check Shards Status And Initialize Variables
2025-10-14T01:09:15,021 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:15,522 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before Cluster Restart
2025-10-14T01:09:16,576 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:17,617 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:18,657 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:20,736 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:21,256 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:23,848 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:25,407 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-45 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream
failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:09:26,967 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-14T01:09:27,238 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node1 2025-10-14T01:09:27,526 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower 2025-10-14T01:09:27,624 | INFO | CommitFutures-10 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed all flows installation for: dpid: openflow:1 in 1580819583280ns 2025-10-14T01:09:27,624 | ERROR | CommitFutures-10 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error: TransactionCommitFailedException{message=canCommit encountered an unexpected failure, errorList=[RpcError [message=canCommit encountered an unexpected failure, severity=ERROR, 
errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ModifyTransactionRequest{target=member-1-datastore-config-fe-0-chn-11-txn-0-1, sequence=11, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#929207398], modifications=0, protocol=SIMPLE} timed out after 120.022408776 seconds. The backend for inventory is not available.]]} in Datastore write operation: dpid: openflow:1, begin tableId: 0, end tableId: 1, sourceIp: 10001 2025-10-14T01:09:27,624 | ERROR | CommitFutures-9 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@69636bcf FAILED due to: org.opendaylight.mdsal.common.api.TransactionCommitFailedException: canCommit encountered an unexpected failure at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.newWithCause(TransactionCommitFailedExceptionMapper.java:42) ~[bundleFile:14.0.18] at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.newWithCause(TransactionCommitFailedExceptionMapper.java:18) ~[bundleFile:14.0.18] at org.opendaylight.yangtools.util.concurrent.ExceptionMapper.apply(ExceptionMapper.java:98) ~[bundleFile:14.0.17] at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.apply(TransactionCommitFailedExceptionMapper.java:37) ~[bundleFile:14.0.18] at org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker.handleException(ConcurrentDOMDataBroker.java:189) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker$1.onFailure(ConcurrentDOMDataBroker.java:133) ~[bundleFile:?] at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1117) ~[bundleFile:?] at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1004) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:767) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:516) ~[bundleFile:?]
at com.google.common.util.concurrent.SettableFuture.setException(SettableFuture.java:54) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.lambda$directCommit$4(AbstractProxyTransaction.java:510) ~[bundleFile:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[bundleFile:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[bundleFile:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[bundleFile:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[bundleFile:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) ~[?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) ~[?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) ~[?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) ~[?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) ~[?:?]
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ModifyTransactionRequest{target=member-1-datastore-config-fe-0-chn-11-txn-0-1, sequence=11, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#929207398], modifications=0, protocol=SIMPLE} timed out after 120.022408776 seconds. The backend for inventory is not available.
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[bundleFile:?]
... 26 more
2025-10-14T01:09:27,814 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster
2025-10-14T01:09:28,126 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Cluster Restart
2025-10-14T01:09:28,424 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill All Cluster Nodes" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill All Cluster Nodes
2025-10-14T01:09:28,453 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
... 5 more
2025-10-14T01:09:28,527 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-34 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
Oct 14, 2025 1:09:55 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Trying to lock /tmp/karaf-0.23.0/lock
Oct 14, 2025 1:09:55 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Lock acquired
Oct 14, 2025 1:09:55 AM org.apache.karaf.main.Main$KarafLockCallback lockAcquired INFO: Lock acquired. Setting startlevel to 100
2025-10-14T01:09:56,104 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.logging]) | EventAdminConfigurationNotifier | 5 - org.ops4j.pax.logging.pax-logging-log4j2 - 2.3.0 | Logging configuration changed. (Event Admin service unavailable - no notification sent).
2025-10-14T01:09:56,264 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.cm/1.3.2 has been started
2025-10-14T01:09:56,287 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.core/1.10.3 has been started
2025-10-14T01:09:56,332 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Starting JMX OSGi agent
2025-10-14T01:09:56,342 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=3ea1ce11-c447-4ec7-9a55-a9b7a8500be3] for service with service.id [15]
2025-10-14T01:09:56,344 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=3ea1ce11-c447-4ec7-9a55-a9b7a8500be3] for service with service.id [40]
2025-10-14T01:09:56,353 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Starting with globalExtender setting: false
2025-10-14T01:09:56,356 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Version = 2.2.6
2025-10-14T01:09:56,441 | INFO | activator-1-thread-1 | Activator | 113 - org.apache.karaf.management.server - 4.4.8 | Setting java.rmi.server.hostname system property to 127.0.0.1
2025-10-14T01:09:56,549 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@2428077b with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=3ea1ce11-c447-4ec7-9a55-a9b7a8500be3
2025-10-14T01:09:56,553 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@2428077b with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=3ea1ce11-c447-4ec7-9a55-a9b7a8500be3
2025-10-14T01:09:56,554 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@2428077b with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=3ea1ce11-c447-4ec7-9a55-a9b7a8500be3
2025-10-14T01:09:56,555 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@2428077b with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=3ea1ce11-c447-4ec7-9a55-a9b7a8500be3
2025-10-14T01:09:56,555 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@2428077b with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=3ea1ce11-c447-4ec7-9a55-a9b7a8500be3
2025-10-14T01:09:56,557 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@2428077b with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=3ea1ce11-c447-4ec7-9a55-a9b7a8500be3
2025-10-14T01:09:56,557 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@2428077b with name osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=3ea1ce11-c447-4ec7-9a55-a9b7a8500be3
2025-10-14T01:09:56,567 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 115 - org.apache.karaf.scr.management - 4.4.8 | Activating the Apache Karaf ServiceComponentRuntime MBean
2025-10-14T01:09:56,661 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.bundle.core/4.4.8
2025-10-14T01:09:56,669 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.config.command/4.4.8
2025-10-14T01:09:56,718 | INFO | activator-1-thread-2 | Activator | 99 - org.apache.karaf.deployer.features - 4.4.8 | Deployment finished. Registering FeatureDeploymentListener
2025-10-14T01:09:56,724 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.deployer.kar/4.4.8
2025-10-14T01:09:56,725 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.diagnostic.core/4.4.8
2025-10-14T01:09:56,737 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.features.command/4.4.8
2025-10-14T01:09:56,750 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.http.core/4.4.8. Missing service: [org.apache.karaf.http.core.ProxyService]
2025-10-14T01:09:56,756 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.instance.core/4.4.8
2025-10-14T01:09:56,765 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jaas.command/4.4.8
2025-10-14T01:09:56,766 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8
2025-10-14T01:09:56,767 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8
2025-10-14T01:09:56,769 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.kar.core/4.4.8
2025-10-14T01:09:56,772 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.log.core/4.4.8
2025-10-14T01:09:56,774 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.package.core/4.4.8
2025-10-14T01:09:56,775 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.service.core/4.4.8
2025-10-14T01:09:56,781 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.commands/4.4.8
2025-10-14T01:09:56,782 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.shell.commands/4.4.8
2025-10-14T01:09:56,784 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | Activator | 120 - org.apache.karaf.shell.core - 4.4.8 | Not starting local console. To activate set karaf.startLocalConsole=true
2025-10-14T01:09:56,813 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.karaf.shell.core/4.4.8 has been started
2025-10-14T01:09:56,842 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.shell.ssh/4.4.8. Missing service: [org.apache.sshd.server.SshServer]
2025-10-14T01:09:56,857 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.system.core/4.4.8
2025-10-14T01:09:56,871 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.web.core/4.4.8. Missing service: [org.apache.karaf.web.WebContainerService]
2025-10-14T01:09:56,906 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | Activator | 392 - org.ops4j.pax.web.pax-web-extender-war - 8.0.33 | Configuring WAR extender thread pool. Pool size = 3
2025-10-14T01:09:57,089 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | Activator | 393 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.33 | Starting Pax Web Whiteboard Extender
2025-10-14T01:09:57,123 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.ssh/4.4.8
2025-10-14T01:09:57,139 | INFO | activator-1-thread-1 | DefaultIoServiceFactoryFactory | 125 - org.apache.sshd.osgi - 2.15.0 | No detected/configured IoServiceFactoryFactory; using Nio2ServiceFactoryFactory
2025-10-14T01:09:57,161 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | log | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Logging initialized @3219ms to org.eclipse.jetty.util.log.Slf4jLog
2025-10-14T01:09:57,194 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.web]) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because configuration has changed
2025-10-14T01:09:57,194 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | EventAdmin support enabled, WAB events will be posted to EventAdmin topics.
2025-10-14T01:09:57,195 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Pax Web Runtime started
2025-10-14T01:09:57,197 | INFO | paxweb-config-3-thread-1 (change config) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because ServerControllerFactory has been registered
2025-10-14T01:09:57,214 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Starting BlueprintBundleTracker
2025-10-14T01:09:57,222 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.karaf.shell.core_4.4.8 [120] was successfully created
2025-10-14T01:09:57,222 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.aries.blueprint.cm_1.3.2 [78] was successfully created
2025-10-14T01:09:57,222 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.aries.blueprint.core_1.10.3 [79] was successfully created
2025-10-14T01:09:57,273 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Configuring server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController
2025-10-14T01:09:57,274 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Configuring JettyServerController{configuration=1929c8f4-2b7e-4012-b97b-cc041f1667ad,state=UNCONFIGURED}
2025-10-14T01:09:57,274 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating Jetty server instance using configuration properties.
2025-10-14T01:09:57,290 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Processing Jetty configuration from files: [etc/jetty.xml]
2025-10-14T01:09:57,436 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Found configured connector "jetty-default": 0.0.0.0:8181
2025-10-14T01:09:57,437 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Using configured jetty-default@2356c2a1{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} as non secure connector for address: 0.0.0.0:8181
2025-10-14T01:09:57,438 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Eagerly starting Jetty thread pool QueuedThreadPool[qtp617931180]@24d4e1ac{STOPPED,0<=0<=200,i=0,r=-1,q=0}[NO_TRY]
2025-10-14T01:09:57,441 | INFO | paxweb-config-3-thread-1 (change controller) | JettyFactory | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding JMX support to Jetty server
2025-10-14T01:09:57,468 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jdbc.core/4.4.8
2025-10-14T01:09:57,481 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Starting server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController
2025-10-14T01:09:57,491 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting JettyServerController{configuration=1929c8f4-2b7e-4012-b97b-cc041f1667ad,state=STOPPED}
2025-10-14T01:09:57,491 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Server@7d3789ed{STOPPED}[9.4.57.v20241219]
2025-10-14T01:09:57,492 | INFO | paxweb-config-3-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | jetty-9.4.57.v20241219; built: 2025-01-08T21:24:30.412Z; git: df524e6b29271c2e09ba9aea83c18dc9db464a31; jvm 21.0.8+9-Ubuntu-0ubuntu122.04.1
2025-10-14T01:09:57,513 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | DefaultSessionIdManager workerName=node0
2025-10-14T01:09:57,513 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | No SessionScavenger set, using defaults
2025-10-14T01:09:57,514 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | node0 Scavenging every 660000ms
2025-10-14T01:09:57,551 | INFO | paxweb-config-3-thread-1 (change controller) | AbstractConnector | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started jetty-default@2356c2a1{HTTP/1.1, (http/1.1)}{0.0.0.0:8181}
2025-10-14T01:09:57,551 | INFO | paxweb-config-3-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started @3617ms
2025-10-14T01:09:57,553 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpService factory
2025-10-14T01:09:57,554 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.http.core_4.4.8 [105]]
2025-10-14T01:09:57,575 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.web.core_4.4.8 [124]]
2025-10-14T01:09:57,580 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.http.core/4.4.8
2025-10-14T01:09:57,587 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.web.core/4.4.8
2025-10-14T01:09:57,591 | INFO | HttpService->WarExtender (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-war_8.0.33 [392]]
2025-10-14T01:09:57,593 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.jolokia.osgi_1.7.2 [155]]
2025-10-14T01:09:57,601 | INFO | paxweb-config-3-thread-1 (change controller) | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-4,contextPath='/'}
2025-10-14T01:09:57,602 | INFO | paxweb-config-3-thread-1 (change controller) | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-3,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@4bcf41f5,contexts=[{HS,OCM-5,context:314133252,/}]}
2025-10-14T01:09:57,603 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-3,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@4bcf41f5,contexts=null}", size=4}
2025-10-14T01:09:57,603 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-4,contextPath='/'}
2025-10-14T01:09:57,605 | INFO | HttpService->Whiteboard (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-whiteboard_8.0.33 [393]]
2025-10-14T01:09:57,629 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2025-10-14T01:09:57,635 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{HS,id=OCM-5,name='context:314133252',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:314133252',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@12b94b04}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@63f253{/,null,STOPPED}
2025-10-14T01:09:57,638 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@63f253{/,null,STOPPED}
2025-10-14T01:09:57,639 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-3,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@4bcf41f5,contexts=[{HS,OCM-5,context:314133252,/}]}
2025-10-14T01:09:57,647 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CustomFilterAdapterConfigurationImpl | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Custom filter properties updated: {service.pid=org.opendaylight.aaa.filterchain, osgi.ds.satisfying.condition.target=(osgi.condition.id=true), customFilterList=, component.name=org.opendaylight.aaa.filterchain.configuration.impl.CustomFilterAdapterConfigurationImpl, felix.fileinstall.filename=file:/tmp/karaf-0.23.0/etc/org.opendaylight.aaa.filterchain.cfg, component.id=4, Filter.target=(org.opendaylight.aaa.filterchain.filter=true)}
2025-10-14T01:09:57,652 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/" with default Osgi Context OsgiContextModel{HS,id=OCM-5,name='context:314133252',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:314133252',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@12b94b04}}
2025-10-14T01:09:57,671 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.aaa.api.AuthenticationService), (objectClass=org.opendaylight.aaa.web.servlet.ServletSupport), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.api.IIDMStore), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2025-10-14T01:09:57,685 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.aaa.shiro_0.21.2 [172]]
2025-10-14T01:09:57,686 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BundleWhiteboardApplication | 393 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.33 | No matching target context(s) for Whiteboard element ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[]}. Filter: (osgi.http.whiteboard.context.name=default). Element may be re-registered later, when matching context/s is/are registered.
2025-10-14T01:09:57,688 | INFO | paxweb-config-3-thread-1 (change controller) | osgi | 155 - org.jolokia.osgi - 1.7.2 | No access restrictor found, access to any MBean is allowed
2025-10-14T01:09:57,696 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.aaa.web.servlet.ServletSupport), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.api.IIDMStore), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2025-10-14T01:09:57,726 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.api.IIDMStore), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2025-10-14T01:09:57,737 | INFO | paxweb-config-3-thread-1 (change controller) | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@63f253{/,null,AVAILABLE}
2025-10-14T01:09:57,738 | INFO | paxweb-config-3-thread-1 (change controller) | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{HS,id=OCM-5,name='context:314133252',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:314133252',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@12b94b04}}} as OSGi service for "/" context path
2025-10-14T01:09:57,742 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpServiceRuntime
2025-10-14T01:09:57,748 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)}", size=1}
2025-10-14T01:09:57,749 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)} to o.o.p.w.s.j.i.PaxWebServletContextHandler@63f253{/,null,AVAILABLE}
2025-10-14T01:09:57,752 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}
2025-10-14T01:09:57,752 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}", size=1}
2025-10-14T01:09:57,753 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}
2025-10-14T01:09:57,791 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | FileAkkaConfigurationReader | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | File-based Pekko configuration reader enabled
2025-10-14T01:09:57,825 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Actor System provider starting
2025-10-14T01:09:58,000 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | ActorSystemProviderImpl | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating new ActorSystem
2025-10-14T01:09:58,344 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Slf4jLogger | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Slf4jLogger started
2025-10-14T01:09:58,576 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ArteryTransport | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Remoting started with transport [Artery tcp]; listening on address [pekko://opendaylight-cluster-data@10.30.171.168:2550] with UID [-1057887249814900388]
2025-10-14T01:09:58,586 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Starting up, Pekko version [1.0.3] ...
2025-10-14T01:09:58,634 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Registered cluster JMX MBean [pekko:type=Cluster]
2025-10-14T01:09:58,636 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Started up successfully
2025-10-14T01:09:58,674 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR started. Config: strategy [KeepMajority], stable-after [7 seconds], down-all-when-unstable [5250 milliseconds], selfUniqueAddress [pekko://opendaylight-cluster-data@10.30.171.168:2550#-1057887249814900388], selfDc [default].
2025-10-14T01:09:59,021 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Actor System provider started
2025-10-14T01:09:59,029 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | FileModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Shard configuration provider started
2025-10-14T01:09:59,055 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.7. Missing service: [org.opendaylight.infrautils.diagstatus.DiagStatusServiceMBean]
2025-10-14T01:09:59,086 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:59,086 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], control stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:59,097 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.116:2550], control stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.116/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:59,098 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:09:59,287 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | ThreadFactory for SystemReadyService created
2025-10-14T01:09:59,289 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | Now starting to provide full system readiness status updates (see TestBundleDiag's logs)...
2025-10-14T01:09:59,291 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DiagStatusServiceImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.7 | Diagnostic Status Service started
2025-10-14T01:09:59,293 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | checkBundleDiagInfos() started...
2025-10-14T01:09:59,298 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | MBeanUtils | 198 - org.opendaylight.infrautils.diagstatus-api - 7.1.7 | MBean registration for org.opendaylight.infrautils.diagstatus:type=SvcStatus SUCCESSFUL.
2025-10-14T01:09:59,298 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DiagStatusServiceMBeanImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.7 | Diagnostic Status Service management started
2025-10-14T01:09:59,299 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.7
2025-10-14T01:09:59,328 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.mastership.MastershipChangeServiceManager), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T01:09:59,344 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2025-10-14T01:09:59,352 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1. Missing service: [org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager]
2025-10-14T01:09:59,358 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.NotificationService)]
2025-10-14T01:09:59,405 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationServiceFactory)]
2025-10-14T01:09:59,412 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2025-10-14T01:09:59,412 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T01:09:59,413 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T01:09:59,417 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | ReconciliationManager started
2025-10-14T01:09:59,417 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1
2025-10-14T01:09:59,418 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T01:09:59,422 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | MessageIntelligenceAgencyImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Registered MBean org.opendaylight.openflowplugin.impl.statistics.ofpspecific:type=MessageIntelligenceAgencyMXBean
2025-10-14T01:09:59,424 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.impl/0.20.1
2025-10-14T01:09:59,455 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T01:09:59,456 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T01:09:59,457 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OpenflowServiceRecoveryHandlerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | Registering openflowplugin service recovery handlers
2025-10-14T01:09:59,461 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.srm-shell/0.20.1. Missing service: [org.opendaylight.serviceutils.srm.spi.RegistryControl, org.opendaylight.mdsal.binding.api.DataBroker]
2025-10-14T01:09:59,466 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | SimpleBindingDOMCodecFactory | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Binding/DOM Codec enabled
2025-10-14T01:09:59,471 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec activating
2025-10-14T01:09:59,473 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec activated
2025-10-14T01:09:59,480 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DefaultBindingRuntimeGenerator | 328 - org.opendaylight.yangtools.binding-generator - 14.0.17 | Binding/YANG type support activated
2025-10-14T01:09:59,488 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Binding Runtime activating
2025-10-14T01:09:59,488 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Binding Runtime activated
2025-10-14T01:09:59,556 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Model Runtime starting
2025-10-14T01:09:59,577 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | KarafFeaturesSupport | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Will attempt to integrate with Karaf FeaturesService
2025-10-14T01:10:00,028 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | NettyTransportSupport | 284 - org.opendaylight.netconf.transport-api - 9.0.1 | Netty transport backed by epoll(2)
2025-10-14T01:10:00,240 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | SharedEffectiveModelContextFactory | 379 - org.opendaylight.yangtools.yang-parser-impl - 14.0.17 | Using weak references
2025-10-14T01:10:00,301 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#919871336]], but this node is not initialized yet
2025-10-14T01:10:00,324 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoinNack message from [Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/system/cluster/core/daemon#585610575]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550]
2025-10-14T01:10:00,325 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoinNack message from [Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/system/cluster/core/daemon#585610575]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550]
2025-10-14T01:10:00,375 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#613429641]], but this node is not initialized yet
2025-10-14T01:10:00,391 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoinNack message from [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon#-801104702]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550]
2025-10-14T01:10:00,411 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] is JOINING itself (with roles [member-1, dc-default], version [0.0.0]) and forming new cluster
2025-10-14T01:10:00,414 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - is the new leader among reachable nodes (more leaders may exist)
2025-10-14T01:10:00,423 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Leader is moving node
[pekko://opendaylight-cluster-data@10.30.171.168:2550] to [Up] 2025-10-14T01:10:00,442 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist). 2025-10-14T01:10:02,252 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiModuleInfoSnapshotImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | EffectiveModelContext generation 1 activated 2025-10-14T01:10:02,252 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.18 | DOM Schema services activated 2025-10-14T01:10:02,253 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.18 | Updating context to generation 1 2025-10-14T01:10:02,257 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DOMRpcRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.18 | DOM RPC/Action router started 2025-10-14T01:10:02,263 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.2 | Remote Operations service starting 2025-10-14T01:10:02,265 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.2 | Remote Operations service started 2025-10-14T01:10:02,359 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-31 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.2 | Initialized with root directory segmented-journal with storage MAPPED 
2025-10-14T01:10:03,038 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiBindingRuntimeContextImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | BindingRuntimeContext generation 1 activated
2025-10-14T01:10:03,057 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiBindingDOMCodecServicesImpl | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec generation 1 activated
2025-10-14T01:10:03,057 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiDatastoreContextIntrospectorFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore Context Introspector activated
2025-10-14T01:10:03,060 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type CONFIGURATION starting
2025-10-14T01:10:03,317 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Create data store instance of type : config
2025-10-14T01:10:03,319 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it
2025-10-14T01:10:03,319 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it
2025-10-14T01:10:03,325 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating ShardManager : shardmanager-config
2025-10-14T01:10:03,341 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Starting ShardManager shard-manager-config
2025-10-14T01:10:03,346 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Recovery complete
2025-10-14T01:10:03,367 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.171.168:2550
2025-10-14T01:10:03,367 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-default-config
2025-10-14T01:10:03,367 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-topology-config
2025-10-14T01:10:03,367 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-inventory-config
2025-10-14T01:10:03,368 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-toaster-config
2025-10-14T01:10:03,416 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Data store config is using tell-based protocol
2025-10-14T01:10:03,420 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it
2025-10-14T01:10:03,421 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-default-config
2025-10-14T01:10:03,421 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it
2025-10-14T01:10:03,422 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type OPERATIONAL starting
2025-10-14T01:10:03,424 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Create data store instance of type : operational
2025-10-14T01:10:03,424 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating ShardManager : shardmanager-operational
2025-10-14T01:10:03,425 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-topology-config
2025-10-14T01:10:03,426 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-inventory-config
2025-10-14T01:10:03,426 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-toaster-config
2025-10-14T01:10:03,430 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Starting ShardManager shard-manager-operational
2025-10-14T01:10:03,436 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-topology-config: Shard created, persistent : true
2025-10-14T01:10:03,437 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Data store operational is using tell-based protocol
2025-10-14T01:10:03,438 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-config: Shard created, persistent : true
2025-10-14T01:10:03,441 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | GlobalBindingDOMCodecServices | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Global Binding/DOM Codec activated with generation 1
2025-10-14T01:10:03,445 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-toaster-config: Shard created, persistent : true
2025-10-14T01:10:03,447 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Recovery complete
2025-10-14T01:10:03,448 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.171.168:2550
2025-10-14T01:10:03,448 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-default-operational
2025-10-14T01:10:03,448 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-topology-operational
2025-10-14T01:10:03,448 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-inventory-operational
2025-10-14T01:10:03,449 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-toaster-operational
2025-10-14T01:10:03,450 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-default-operational
2025-10-14T01:10:03,450 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-topology-operational
2025-10-14T01:10:03,450 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiBlockingBindingNormalizer | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter activated
2025-10-14T01:10:03,451 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-inventory-operational
2025-10-14T01:10:03,452 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-toaster-operational
2025-10-14T01:10:03,453 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: Shard created, persistent : false
2025-10-14T01:10:03,453 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-topology-operational: Shard created, persistent : false
2025-10-14T01:10:03,453 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-operational: Shard created, persistent : false
2025-10-14T01:10:03,453 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Shard created, persistent : true
2025-10-14T01:10:03,454 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-toaster-operational: Shard created, persistent : false
2025-10-14T01:10:03,461 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for MountPointService activated
2025-10-14T01:10:03,473 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DOMNotificationRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.18 | DOM Notification Router started
2025-10-14T01:10:03,477 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService)]
2025-10-14T01:10:03,477 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for NotificationService activated
2025-10-14T01:10:03,481 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService)]
2025-10-14T01:10:03,481 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-topology-config/member-1-shard-topology-config-notifier#-729916022 created and ready for shard:member-1-shard-topology-config
2025-10-14T01:10:03,481 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-default-operational/member-1-shard-default-operational-notifier#-324288518 created and ready for shard:member-1-shard-default-operational
2025-10-14T01:10:03,481 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Starting recovery with journal batch size 1
2025-10-14T01:10:03,482 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for NotificationPublishService activated
2025-10-14T01:10:03,483 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Starting recovery with journal batch size 1
2025-10-14T01:10:03,483 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2025-10-14T01:10:03,483 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T01:10:03,484 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for RpcService activated
2025-10-14T01:10:03,485 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T01:10:03,489 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for RpcProviderService activated
2025-10-14T01:10:03,492 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-toaster-operational/member-1-shard-toaster-operational-notifier#-1732784911 created and ready for shard:member-1-shard-toaster-operational
2025-10-14T01:10:03,492 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Starting recovery with journal batch size 1
2025-10-14T01:10:03,493 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-43 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.2 | Initialized with root directory segmented-journal with storage DISK
2025-10-14T01:10:03,493 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-inventory-operational/member-1-shard-inventory-operational-notifier#1754162803 created and ready for shard:member-1-shard-inventory-operational
2025-10-14T01:10:03,494 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-default-config/member-1-shard-default-config-notifier#-1259126490 created and ready for shard:member-1-shard-default-config
2025-10-14T01:10:03,494 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Starting recovery with journal batch size 1
2025-10-14T01:10:03,496 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-toaster-config/member-1-shard-toaster-config-notifier#-1887224326 created and ready for shard:member-1-shard-toaster-config
2025-10-14T01:10:03,496 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Starting recovery with journal batch size 1
2025-10-14T01:10:03,496 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Starting recovery with journal batch size 1
2025-10-14T01:10:03,497 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Starting recovery with journal batch size 1
2025-10-14T01:10:03,497 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-config/member-1-shard-inventory-config/member-1-shard-inventory-config-notifier#-775691583 created and ready for shard:member-1-shard-inventory-config
2025-10-14T01:10:03,498 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.171.168:2550/user/shardmanager-operational/member-1-shard-topology-operational/member-1-shard-topology-operational-notifier#187480210 created and ready for shard:member-1-shard-topology-operational
2025-10-14T01:10:03,498 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Starting recovery with journal batch size 1
2025-10-14T01:10:03,538 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Singleton manager starting singleton actor [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor]
2025-10-14T01:10:03,539 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | ClusterSingletonManager state change [Start -> Oldest]
2025-10-14T01:10:03,563 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-10-14T01:10:03,564 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2025-10-14T01:10:03,567 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for ActionService activated
2025-10-14T01:10:03,569 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for ActionProviderService activated
2025-10-14T01:10:03,570 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | DynamicBindingAdapter | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | 8 DOMService trackers started
2025-10-14T01:10:03,571 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2025-10-14T01:10:03,571 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2025-10-14T01:10:03,583 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | GlobalBindingRuntimeContext | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Global BindingRuntimeContext generation 1 activated
2025-10-14T01:10:03,584 | INFO | Start Level: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Model Runtime started
2025-10-14T01:10:03,611 | INFO | Framework Event Dispatcher: Equinox Container: 3ea1ce11-c447-4ec7-9a55-a9b7a8500be3 | Main | 4 - org.ops4j.pax.logging.pax-logging-api - 2.3.0 | Karaf started in 8s. Bundle stats: 397 active, 398 total
2025-10-14T01:10:03,650 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: journal open: applyTo=0
2025-10-14T01:10:03,650 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: journal open: applyTo=0
2025-10-14T01:10:03,651 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: journal open: applyTo=0
2025-10-14T01:10:03,651 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: journal open: applyTo=0
2025-10-14T01:10:03,651 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: journal open: applyTo=0
2025-10-14T01:10:03,652 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: journal open: applyTo=0
2025-10-14T01:10:03,660 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: journal open: applyTo=95
2025-10-14T01:10:03,671 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0
2025-10-14T01:10:03,671 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0
2025-10-14T01:10:03,686 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | Recovery snapshot applied for member-1-shard-topology-operational in 6.420 μs: snapshotIndex=-1, snapshotTerm=-1, journal-size=0
2025-10-14T01:10:03,686 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Recovery completed in 1.063 ms - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0
2025-10-14T01:10:03,687 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | Recovery snapshot applied for member-1-shard-default-operational in 2.400 μs: snapshotIndex=-1, snapshotTerm=-1, journal-size=0
2025-10-14T01:10:03,687 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0
2025-10-14T01:10:03,687 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Recovery completed in 2.504 ms - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0
2025-10-14T01:10:03,688 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 -
org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-operational , received role change from null to Follower 2025-10-14T01:10:03,688 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T01:10:03,689 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from null to Follower 2025-10-14T01:10:03,689 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-14T01:10:03,689 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-config , received role change from null to Follower 2025-10-14T01:10:03,690 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-14T01:10:03,683 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from null to 
Follower 2025-10-14T01:10:03,687 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from null to Follower 2025-10-14T01:10:03,690 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-14T01:10:03,690 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-14T01:10:03,690 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-14T01:10:03,690 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-default-operational from null to Follower 2025-10-14T01:10:03,690 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-topology-config from null to Follower 2025-10-14T01:10:03,691 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | 
shard-manager-config: Received role changed for member-1-shard-toaster-config from null to Follower 2025-10-14T01:10:03,691 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from null to Follower 2025-10-14T01:10:03,691 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from null to Follower 2025-10-14T01:10:03,695 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-14T01:10:03,695 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from null to Follower 2025-10-14T01:10:03,696 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-14T01:10:03,696 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from null to Follower 2025-10-14T01:10:03,751 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | 
RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-config , received role change from null to Follower 2025-10-14T01:10:03,751 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-14T01:10:03,752 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-default-config from null to Follower 2025-10-14T01:10:03,772 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: journal open: applyTo=40009 2025-10-14T01:10:04,317 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | Recovery snapshot applied for member-1-shard-inventory-config in 381.5 ms: snapshotIndex=39941, snapshotTerm=4, journal-size=0 2025-10-14T01:10:04,317 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Recovery completed in 382.1 ms - Switching actor to Follower - last log index = 39941, last log term = 4, snapshot index = 39941, snapshot term = 4, journal size = 0 2025-10-14T01:10:04,334 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from null to Follower 2025-10-14T01:10:04,334 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-14T01:10:04,335 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-inventory-config from null to Follower 2025-10-14T01:10:04,534 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | ClusterSingletonProxy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Singleton identified at [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2025-10-14T01:10:12,410 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#613429641]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550] 2025-10-14T01:10:12,410 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.171.168:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#613429641]] (version [1.0.3]) 2025-10-14T01:10:12,457 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node 
[pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#919871336]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550] 2025-10-14T01:10:12,457 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.171.168:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.116:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#919871336]] (version [1.0.3]) 2025-10-14T01:10:12,494 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.161:2550] is JOINING, roles [member-3, dc-default], version [0.0.0] 2025-10-14T01:10:12,496 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#1193261120] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:10:12,496 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-1841838325] was unhandled. [2] dead letters encountered. 
This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:10:12,513 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Node [pekko://opendaylight-cluster-data@10.30.170.116:2550] is JOINING, roles [member-2, dc-default], version [0.0.0] 2025-10-14T01:10:12,514 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#1193261120] was unhandled. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:10:12,515 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-1841838325] was unhandled. [4] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:10:12,955 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.116:2550] to [Up] 2025-10-14T01:10:12,955 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.171.161:2550] to [Up] 2025-10-14T01:10:12,956 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550 2025-10-14T01:10:12,956 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-default-config 2025-10-14T01:10:12,956 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.116:2550 2025-10-14T01:10:12,957 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-topology-config 2025-10-14T01:10:12,957 | INFO | 
opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-10-14T01:10:12,957 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-10-14T01:10:12,957 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-10-14T01:10:12,957 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-10-14T01:10:12,958 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-10-14T01:10:12,958 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Peer address for peer 
member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-default-config 2025-10-14T01:10:12,958 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-10-14T01:10:12,958 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-topology-config 2025-10-14T01:10:12,959 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-10-14T01:10:12,959 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-10-14T01:10:12,959 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-10-14T01:10:12,959 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is not the leader any more and not responsible for taking SBR decisions. 2025-10-14T01:10:12,960 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T01:10:12,960 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-10-14T01:10:12,960 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-14T01:10:12,960 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-10-14T01:10:12,961 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T01:10:12,961 | INFO | 
opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-14T01:10:12,961 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-14T01:10:12,962 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-14T01:10:12,962 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-14T01:10:12,962 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-14T01:10:12,963 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Peer address for peer 
member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-14T01:10:12,963 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-14T01:10:12,963 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-14T01:10:12,961 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-14T01:10:12,964 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-14T01:10:12,964 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-14T01:10:12,964 | INFO | 
opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-14T01:10:12,964 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-14T01:10:12,964 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-14T01:10:12,959 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.116:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-10-14T01:10:12,965 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-14T01:10:13,713 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Candidate): Starting new election term 6 
2025-10-14T01:10:13,714 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 6
2025-10-14T01:10:13,715 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Follower to Candidate
2025-10-14T01:10:13,715 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Follower to Candidate
2025-10-14T01:10:13,757 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 6
2025-10-14T01:10:13,758 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Candidate to Leader
2025-10-14T01:10:13,758 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@64a6d7dd
2025-10-14T01:10:13,759 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Candidate to Leader
2025-10-14T01:10:13,766 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Candidate): Starting new election term 4
2025-10-14T01:10:13,767 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 4
2025-10-14T01:10:13,767 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Candidate): Starting new election term 4
2025-10-14T01:10:13,767 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Follower to Candidate
2025-10-14T01:10:13,768 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4
2025-10-14T01:10:13,768 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Follower to Candidate
2025-10-14T01:10:13,768 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Follower to Candidate
2025-10-14T01:10:13,770 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Follower to Candidate
2025-10-14T01:10:13,776 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational (Candidate): Starting new election term 4
2025-10-14T01:10:13,776 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Candidate): Starting new election term 4
2025-10-14T01:10:13,776 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 4
2025-10-14T01:10:13,777 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4
2025-10-14T01:10:13,778 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from Follower to Candidate
2025-10-14T01:10:13,778 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-config , received role change from Follower to Candidate
2025-10-14T01:10:13,779 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-default-config from Follower to Candidate
2025-10-14T01:10:13,779 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4
2025-10-14T01:10:13,779 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 4
2025-10-14T01:10:13,779 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from Follower to Candidate
2025-10-14T01:10:13,780 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Candidate to Leader
2025-10-14T01:10:13,779 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Candidate to Leader
2025-10-14T01:10:13,780 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@64967f08
2025-10-14T01:10:13,780 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Candidate to Leader
2025-10-14T01:10:13,780 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@410e60df
2025-10-14T01:10:13,781 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Candidate to Leader
2025-10-14T01:10:13,786 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Candidate): Starting new election term 5
2025-10-14T01:10:13,786 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 5
2025-10-14T01:10:13,786 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Follower to Candidate
2025-10-14T01:10:13,787 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-topology-config from Follower to Candidate
2025-10-14T01:10:13,793 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 4
2025-10-14T01:10:13,794 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4e231d18
2025-10-14T01:10:13,794 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from Candidate to Leader
2025-10-14T01:10:13,794 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from Candidate to Leader
2025-10-14T01:10:13,795 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4
2025-10-14T01:10:13,796 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@37ed4bd0
2025-10-14T01:10:13,796 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Candidate): Starting new election term 4
2025-10-14T01:10:13,797 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 4
2025-10-14T01:10:13,797 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Follower to Candidate
2025-10-14T01:10:13,797 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Follower to Candidate
2025-10-14T01:10:13,797 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-default-config , received role change from Candidate to Leader
2025-10-14T01:10:13,798 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-default-config from Candidate to Leader
2025-10-14T01:10:13,798 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 5
2025-10-14T01:10:13,799 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Candidate to Leader
2025-10-14T01:10:13,799 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@410cc4cc
2025-10-14T01:10:13,799 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-topology-config from Candidate to Leader
2025-10-14T01:10:13,807 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 4
2025-10-14T01:10:13,807 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Candidate to Leader
2025-10-14T01:10:13,807 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@545faebc
2025-10-14T01:10:13,807 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Candidate to Leader
2025-10-14T01:10:13,808 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready
2025-10-14T01:10:13,810 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore service type OPERATIONAL activated
2025-10-14T01:10:13,811 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type OPERATIONAL started
2025-10-14T01:10:13,974 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - is no longer leader
2025-10-14T01:10:14,438 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate): Starting new election term 5
2025-10-14T01:10:14,439 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 5
2025-10-14T01:10:14,439 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Follower to Candidate
2025-10-14T01:10:14,440 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Follower to Candidate
2025-10-14T01:10:14,444 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate): Term 70 in "RequestVoteReply{term=70, voteGranted=false}" message is greater than Candidate's term 5 - switching to Follower
2025-10-14T01:10:14,448 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 70
2025-10-14T01:10:14,448 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Candidate to Follower
2025-10-14T01:10:14,448 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Candidate to Follower
2025-10-14T01:10:16,737 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower): Term 71 in "RequestVote{term=71, candidateId=member-3-shard-inventory-config, lastLogIndex=112611, lastLogTerm=4}" message is greater than follower's term 70 - updating term
2025-10-14T01:10:16,758 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2283c403
2025-10-14T01:10:16,759 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready
2025-10-14T01:10:16,759 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done false
2025-10-14T01:10:16,761 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore service type CONFIGURATION activated
2025-10-14T01:10:16,765 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done true
2025-10-14T01:10:16,778 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OSGiClusterAdmin | 193 - org.opendaylight.controller.sal-cluster-admin-impl - 11.0.2 | Cluster Admin services started
2025-10-14T01:10:16,785 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ConcurrentDOMDataBroker | 358 - org.opendaylight.yangtools.util - 14.0.17 | ThreadFactory created: CommitFutures
2025-10-14T01:10:16,787 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | DataBrokerCommitExecutor | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | DOM Data Broker commit exector started
2025-10-14T01:10:16,789 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ConcurrentDOMDataBroker | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | DOM Data Broker started
2025-10-14T01:10:16,793 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T01:10:16,793 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for DataBroker activated
2025-10-14T01:10:16,851 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, nanosAgo=3053446310, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}
2025-10-14T01:10:16,857 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, nanosAgo=3058991034, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}
2025-10-14T01:10:16,859 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#2023042842], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}
2025-10-14T01:10:16,860 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#2023042842], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}}
2025-10-14T01:10:16,873 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#2023042842], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 13.76 ms
2025-10-14T01:10:16,955 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)]
2025-10-14T01:10:17,009 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OSGiPasswordServiceConfigBootstrap | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | Listening for password service configuration
2025-10-14T01:10:17,010 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.api.IIDMStore), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T01:10:17,021 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T01:10:17,026 | ERROR | opendaylight-cluster-data-notification-dispatcher-49 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | bundle org.opendaylight.aaa.idm-store-h2:0.21.2 (167)[org.opendaylight.aaa.datastore.h2.H2Store(5)] : Constructor argument 0 in class class org.opendaylight.aaa.datastore.h2.H2Store has unsupported type org.opendaylight.aaa.datastore.h2.ConnectionProvider
2025-10-14T01:10:17,033 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will utilize default iteration count=20000
2025-10-14T01:10:17,033 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will utilize default algorithm=SHA-512
2025-10-14T01:10:17,034 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will not utilize a private salt, since none was configured
2025-10-14T01:10:17,055 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | H2 IDMStore activated
2025-10-14T01:10:17,058 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T01:10:17,059 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T01:10:17,078 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig]
2025-10-14T01:10:17,100 | INFO | Blueprint Extender: 2 | AaaCertMdsalProvider | 163 - org.opendaylight.aaa.cert - 0.21.2 | AaaCertMdsalProvider Initialized
2025-10-14T01:10:17,105 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)]
2025-10-14T01:10:17,129 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | EOSClusterSingletonServiceProvider | 257 - org.opendaylight.mdsal.mdsal-singleton-impl - 14.0.18 | Cluster Singleton Service started
2025-10-14T01:10:17,141 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | ietf-yang-library writer registered
2025-10-14T01:10:17,171 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ArbitratorReconciliationManagerImpl | 296 - org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl - 0.20.1 | ArbitratorReconciliationManager has started successfully.
2025-10-14T01:10:17,177 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#-1086354548], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}
2025-10-14T01:10:17,177 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#-1086354548], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}}
2025-10-14T01:10:17,180 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#-1086354548], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 2.300 ms
2025-10-14T01:10:17,193 | INFO | opendaylight-cluster-data-notification-dispatcher-45 | AAAEncryptionServiceImpl | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | AAAEncryptionService activated
2025-10-14T01:10:17,193 | INFO | opendaylight-cluster-data-notification-dispatcher-45 | OSGiEncryptionServiceConfigurator | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | Encryption Service enabled
2025-10-14T01:10:17,208 | INFO | Blueprint Extender: 2 | LazyBindingList | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Using lazy population for lists larger than 16 element(s)
2025-10-14T01:10:17,218 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), Initial app config ForwardingRulesManagerConfig]
2025-10-14T01:10:17,239 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2025-10-14T01:10:17,239 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-14T01:10:17,244 | INFO | Blueprint Extender: 2 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.2 | Certificate Manager service has been initialized
2025-10-14T01:10:17,248 | INFO | Blueprint Extender: 2 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.2 | AaaCert Rpc Service has been initialized
2025-10-14T01:10:17,264 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 has been started
2025-10-14T01:10:17,264 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.aaa.cert_0.21.2 [163] was successfully created
2025-10-14T01:10:17,266 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | DeviceOwnershipService started
2025-10-14T01:10:17,286 | INFO | Blueprint Extender: 1 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Checking if default entries must be created in IDM store
2025-10-14T01:10:17,308 | INFO | Blueprint Extender: 3 | LLDPSpeaker | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.1 | LLDPSpeaker started, it will send LLDP frames each 5 seconds
2025-10-14T01:10:17,352 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | DefaultConfigPusher | 301 - org.opendaylight.openflowplugin.applications.of-switch-config-pusher - 0.20.1 | DefaultConfigPusher has started.
2025-10-14T01:10:17,354 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-10-14T01:10:17,364 | INFO | Blueprint Extender: 3 | NodeConnectorInventoryEventTranslator | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.1 | NodeConnectorInventoryEventTranslator has started. 2025-10-14T01:10:17,368 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 has been started 2025-10-14T01:10:17,368 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-10-14T01:10:17,369 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.lldp-speaker_0.20.1 [300] was successfully created 2025-10-14T01:10:17,406 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | FlowCapableTopologyProvider | 304 - org.opendaylight.openflowplugin.applications.topology-manager - 0.20.1 | Topology Manager service started. 
2025-10-14T01:10:17,480 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Loading properties from '(urn:opendaylight:params:xml:ns:yang:openflow:provider:config?revision=2016-05-10)openflow-provider-config' YANG file
2025-10-14T01:10:17,484 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | rpc-requests-quota configuration property was changed to '20000'
2025-10-14T01:10:17,489 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | global-notification-quota configuration property was changed to '64000'
2025-10-14T01:10:17,490 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | switch-features-mandatory configuration property was changed to 'false'
2025-10-14T01:10:17,490 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | enable-flow-removed-notification configuration property was changed to 'true'
2025-10-14T01:10:17,490 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-statistics-rpc-enabled configuration property was changed to 'false'
2025-10-14T01:10:17,490 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | barrier-count-limit configuration property was changed to '25600'
2025-10-14T01:10:17,490 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | barrier-interval-timeout-limit configuration property was changed to '500'
2025-10-14T01:10:17,490 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | echo-reply-timeout configuration property was changed to '2000'
2025-10-14T01:10:17,490 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-statistics-polling-on configuration property was changed to 'true'
2025-10-14T01:10:17,490 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-table-statistics-polling-on configuration property was changed to 'true'
2025-10-14T01:10:17,491 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-flow-statistics-polling-on configuration property was changed to 'true'
2025-10-14T01:10:17,491 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-group-statistics-polling-on configuration property was changed to 'true'
2025-10-14T01:10:17,491 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-meter-statistics-polling-on configuration property was changed to 'true'
2025-10-14T01:10:17,493 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-port-statistics-polling-on configuration property was changed to 'true'
2025-10-14T01:10:17,493 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-queue-statistics-polling-on configuration property was changed to 'true'
2025-10-14T01:10:17,493 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | skip-table-features configuration property was changed to 'true'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | basic-timer-delay configuration property was changed to '3000'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | maximum-timer-delay configuration property was changed to '900000'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | use-single-layer-serialization configuration property was changed to 'true'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-min-threads configuration property was changed to '1'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-max-threads configuration property was changed to '32000'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-timeout configuration property was changed to '60'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-connection-rate-limit-per-min configuration property was changed to '0'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-connection-hold-time-in-seconds configuration property was changed to '0'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-datastore-removal-delay configuration property was changed to '500'
2025-10-14T01:10:17,494 | INFO | Blueprint Extender: 3 | OSGiConfigurationServiceFactory | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Loading configuration from 'org.opendaylight.openflowplugin' configuration file
2025-10-14T01:10:17,501 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | MD-SAL configuration-based SwitchConnectionProviders started
2025-10-14T01:10:17,505 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Starting instance of type 'openflow-switch-connection-provider-default-impl'
2025-10-14T01:10:17,525 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | felix.fileinstall.filename configuration property was changed to 'file:/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg'
2025-10-14T01:10:17,526 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | service.pid configuration property was changed to 'org.opendaylight.openflowplugin'
2025-10-14T01:10:17,584 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | org.opendaylight.openflowplugin.applications.frm.impl.ForwardingRulesManagerImpl@18e2b028 was registered as configuration listener to OpenFlowPlugin configuration service
2025-10-14T01:10:17,592 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.srm-shell/0.20.1
2025-10-14T01:10:17,666 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}]
2025-10-14T01:10:17,677 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] already present
2025-10-14T01:10:17,679 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}]
2025-10-14T01:10:17,679 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] already present
2025-10-14T01:10:17,683 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type CONFIGURATION started
2025-10-14T01:10:17,684 | INFO | Blueprint Extender: 1 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Found default domain in IDM store, skipping insertion of default data
2025-10-14T01:10:17,687 | INFO | Blueprint Extender: 1 | AAAShiroProvider | 172 - org.opendaylight.aaa.shiro - 0.21.2 | AAAShiroProvider Session Initiated
2025-10-14T01:10:17,747 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Starting instance of type 'openflow-switch-connection-provider-legacy-impl'
2025-10-14T01:10:17,763 | INFO | Blueprint Extender: 2 | ForwardingRulesManagerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | ForwardingRulesManager has started successfully.
2025-10-14T01:10:17,765 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 has been started
2025-10-14T01:10:17,766 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager_0.20.1 [299] was successfully created
2025-10-14T01:10:17,805 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | org.opendaylight.openflowplugin.applications.topology.lldp.LLDPLinkAger@19fff02 was registered as configuration listener to OpenFlowPlugin configuration service
2025-10-14T01:10:17,828 | INFO | Blueprint Extender: 1 | IniSecurityManagerFactory | 171 - org.opendaylight.aaa.repackaged-shiro - 0.21.2 | Realms have been explicitly set on the SecurityManager instance - auto-setting of realms will not occur.
2025-10-14T01:10:17,853 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.1 | Starting LLDPActivator with lldpSecureKey: aa9251f8-c7c0-4322-b8d6-c3a84593bda3
2025-10-14T01:10:17,856 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.1 | LLDPDiscoveryListener started.
2025-10-14T01:10:17,858 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 has been started
2025-10-14T01:10:17,861 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery_0.20.1 [303] was successfully created
2025-10-14T01:10:17,887 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-11,contextPath='/auth'}
2025-10-14T01:10:17,887 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}", size=2}
2025-10-14T01:10:17,887 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-11,contextPath='/auth'}
2025-10-14T01:10:17,888 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@6bcb9484{/auth,null,STOPPED}
2025-10-14T01:10:17,889 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@6bcb9484{/auth,null,STOPPED}
2025-10-14T01:10:17,892 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2025-10-14T01:10:17,893 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2}
2025-10-14T01:10:17,894 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2025-10-14T01:10:17,894 | INFO | Blueprint Extender: 1 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.aaa.shiro_0.21.2 [172] registered context path /auth with 4 service(s)
2025-10-14T01:10:17,895 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/auth" with default Osgi Context OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}
2025-10-14T01:10:17,897 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Initializing CustomFilterAdapter
2025-10-14T01:10:17,898 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Injecting a new filter chain with 0 Filters:
2025-10-14T01:10:17,898 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@6bcb9484{/auth,null,AVAILABLE}
2025-10-14T01:10:17,898 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}} as OSGi service for "/auth" context path
2025-10-14T01:10:17,899 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2025-10-14T01:10:17,901 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2025-10-14T01:10:17,902 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2}
2025-10-14T01:10:17,902 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2025-10-14T01:10:17,903 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2025-10-14T01:10:17,903 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2025-10-14T01:10:17,903 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=1}
2025-10-14T01:10:17,903 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2025-10-14T01:10:17,908 | ERROR | Blueprint Extender: 1 | MdsalRestconfServer | 279 - org.opendaylight.netconf.restconf-server-mdsal - 9.0.1 | bundle org.opendaylight.netconf.restconf-server-mdsal:9.0.1 (279)[org.opendaylight.restconf.server.mdsal.MdsalRestconfServer(69)] : Constructor argument 5 in class class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer has unsupported type [Lorg.opendaylight.restconf.server.spi.RpcImplementation;
2025-10-14T01:10:17,938 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: Store Tx member-1-datastore-operational-fe-1-txn-2-0: Conflicting modification for path /(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)streams/stream/stream[{(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)name=NETCONF}].
2025-10-14T01:10:17,985 | INFO | Blueprint Extender: 1 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278]]
2025-10-14T01:10:17,986 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-18,contextPath='/rests'}
2025-10-14T01:10:17,987 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=310, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}", size=2}
2025-10-14T01:10:17,987 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-18,contextPath='/rests'}
2025-10-14T01:10:17,987 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=310, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@605f1964{/rests,null,STOPPED}
2025-10-14T01:10:17,988 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@605f1964{/rests,null,STOPPED}
2025-10-14T01:10:17,988 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2025-10-14T01:10:17,989 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2}
2025-10-14T01:10:17,989 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2025-10-14T01:10:17,989 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2025-10-14T01:10:17,989 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/rests" with default Osgi Context OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=310, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}
2025-10-14T01:10:17,989 | INFO | Blueprint Extender: 1 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278] registered context path /rests with 4 service(s)
2025-10-14T01:10:17,990 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Initializing CustomFilterAdapter
2025-10-14T01:10:17,990 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Injecting a new filter chain with 0 Filters:
2025-10-14T01:10:17,990 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@605f1964{/rests,null,AVAILABLE}
2025-10-14T01:10:17,990 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=310, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}} as OSGi service for "/rests" context path
2025-10-14T01:10:17,991 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2025-10-14T01:10:17,991 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2025-10-14T01:10:17,991 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2}
2025-10-14T01:10:17,991 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2025-10-14T01:10:17,991 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2025-10-14T01:10:17,992 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2025-10-14T01:10:17,992 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2025-10-14T01:10:17,992 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=1}
2025-10-14T01:10:17,992 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2025-10-14T01:10:17,992 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-27,contextPath='/.well-known'}
2025-10-14T01:10:17,992 | INFO | Blueprint Extender: 1 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278] registered context path /.well-known with 3 service(s)
2025-10-14T01:10:17,992 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=314, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}", size=2}
2025-10-14T01:10:17,992 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-27,contextPath='/.well-known'}
2025-10-14T01:10:17,993 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=314, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@1f13b49a{/.well-known,null,STOPPED}
2025-10-14T01:10:17,994 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@1f13b49a{/.well-known,null,STOPPED}
2025-10-14T01:10:17,994 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-25,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]}
2025-10-14T01:10:17,994 | INFO | Blueprint Extender: 1 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | Binding URL provider org.opendaylight.restconf.server.jaxrs.JaxRsYangLibrary@7402de60
2025-10-14T01:10:17,994 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-25,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]}", size=2}
2025-10-14T01:10:17,994 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2025-10-14T01:10:17,994 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2025-10-14T01:10:17,995 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known
2025-10-14T01:10:17,995 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/.well-known" with default Osgi Context OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=314, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}
2025-10-14T01:10:17,995 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@1f13b49a{/.well-known,null,AVAILABLE}
2025-10-14T01:10:17,995 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=314, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}} as OSGi service for "/.well-known" context path
2025-10-14T01:10:17,996 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2025-10-14T01:10:17,999 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-26,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]}
2025-10-14T01:10:17,999 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-26,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]}", size=1}
2025-10-14T01:10:17,999 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-26,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]}
2025-10-14T01:10:18,001 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1278661122], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-10-14T01:10:18,001 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1278661122], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-14T01:10:18,002 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1278661122], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 919.4 μs
2025-10-14T01:10:18,023 | INFO | Blueprint Extender: 3 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | OpenFlowPluginProvider started, waiting for onSystemBootReady()
2025-10-14T01:10:18,023 | INFO | Blueprint Extender: 3 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@7650d4b3
2025-10-14T01:10:18,024 | INFO | Blueprint Extender: 3 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@4303c177
2025-10-14T01:10:18,029 | INFO |
Blueprint Extender: 3 | OnfExtensionProvider | 308 - org.opendaylight.openflowplugin.extension-onf - 0.20.1 | ONF Extension Provider started. 2025-10-14T01:10:18,030 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 has been started 2025-10-14T01:10:18,031 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.impl_0.20.1 [309] was successfully created 2025-10-14T01:10:18,033 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4AddressNoZone 2025-10-14T01:10:18,034 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4Prefix 2025-10-14T01:10:18,034 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6AddressNoZone 2025-10-14T01:10:18,035 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6Prefix 2025-10-14T01:10:18,067 | INFO | Blueprint Extender: 1 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.1 | Initialized with service class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer 2025-10-14T01:10:18,068 | INFO | Blueprint Extender: 1 | 
RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.1 | Initialized with base path: /restconf, default encoding: JSON, default pretty print: false 2025-10-14T01:10:18,098 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-14T01:10:18,098 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-14T01:10:18,112 | INFO | Blueprint Extender: 1 | OSGiNorthbound | 275 - org.opendaylight.netconf.restconf-nb - 9.0.1 | Global RESTCONF northbound pools started 2025-10-14T01:10:18,113 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 has been started 2025-10-14T01:10:18,114 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.aaa.shiro_0.21.2 [172] was successfully created 2025-10-14T01:10:18,197 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: Store Tx member-2-datastore-operational-fe-1-txn-3-0: Conflicting modification for path /(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)streams/stream/stream[{(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)name=NETCONF}]. 
2025-10-14T01:10:18,747 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | checkBundleDiagInfos: Elapsed time 19s, remaining time 280s, diag: Active {INSTALLED=0, RESOLVED=10, UNKNOWN=0, GRACE_PERIOD=0, WAITING=0, STARTING=0, ACTIVE=397, STOPPING=0, FAILURE=0} 2025-10-14T01:10:18,747 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 2025-10-14T01:10:18,747 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.7 | Now notifying all its registered SystemReadyListeners... 2025-10-14T01:10:18,747 | INFO | SystemReadyService-0 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | onSystemBootReady() received, starting the switch connections 2025-10-14T01:10:18,863 | INFO | epollEventLoopGroup-2-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6653 2025-10-14T01:10:18,864 | INFO | epollEventLoopGroup-4-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6633 2025-10-14T01:10:18,865 | INFO | epollEventLoopGroup-4-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6633 2025-10-14T01:10:18,865 | INFO | epollEventLoopGroup-2-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6653 2025-10-14T01:10:18,865 | INFO | epollEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 
309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@4303c177 started 2025-10-14T01:10:18,865 | INFO | epollEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@7650d4b3 started 2025-10-14T01:10:18,866 | INFO | epollEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | All switchConnectionProviders are up and running (2). 2025-10-14T01:10:25,739 | INFO | qtp617931180-337 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.2 | Authentication is now enabled 2025-10-14T01:10:25,740 | INFO | qtp617931180-337 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.2 | Authentication Manager activated 2025-10-14T01:10:33,738 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=40009, lastAppliedTerm=4, lastIndex=62283, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=22274, mandatoryTrim=false] 2025-10-14T01:10:33,740 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Persisting snapshot at EntryInfo[index=40009, term=4]/EntryInfo[index=62283, term=4] 2025-10-14T01:10:33,741 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 40009 and term: 4 2025-10-14T01:10:35,361 | INFO | 
opendaylight-cluster-data-shard-dispatcher-36 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: snapshot is durable as of 2025-10-14T01:10:33.741351189Z 2025-10-14T01:10:48,163 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=40009, lastAppliedTerm=4, lastIndex=83193, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=43184, mandatoryTrim=false] 2025-10-14T01:10:48,165 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Persisting snapshot at EntryInfo[index=40009, term=4]/EntryInfo[index=83193, term=4] 2025-10-14T01:10:48,166 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 40009 and term: 4 2025-10-14T01:10:49,467 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: snapshot is durable as of 2025-10-14T01:10:48.165457181Z 2025-10-14T01:11:00,549 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, nanosAgo=46751712548, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2} 2025-10-14T01:11:01,168 | INFO | 
opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=40009, lastAppliedTerm=4, lastIndex=100618, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=60609, mandatoryTrim=false] 2025-10-14T01:11:01,168 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Persisting snapshot at EntryInfo[index=40009, term=4]/EntryInfo[index=100618, term=4] 2025-10-14T01:11:01,169 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 40009 and term: 4 2025-10-14T01:11:01,419 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower): Missing index 100619 from log. Cannot apply state. Ignoring 100619 to 112612 2025-10-14T01:11:01,420 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: snapshot is durable as of 2025-10-14T01:11:01.168969577Z 2025-10-14T01:11:01,810 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: Store Tx member-3-datastore-operational-fe-2-txn-3-0: Conflicting modification for path /(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)streams/stream/stream[{(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)name=NETCONF}]. 
2025-10-14T01:11:02,979 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower): Missing index 104104 from log. Cannot apply state. Ignoring 104104 to 112612 2025-10-14T01:11:05,800 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower): Missing index 107589 from log. Cannot apply state. Ignoring 107589 to 112612 2025-10-14T01:11:08,074 | INFO | qtp617931180-337 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.1 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-10-14T01:11:08,078 | INFO | qtp617931180-337 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.1 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-10-14T01:11:08,116 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower): Missing index 111074 from log. Cannot apply state. 
Ignoring 111074 to 112612 2025-10-14T01:11:08,327 | INFO | qtp617931180-337 | ApiPathParser | 273 - org.opendaylight.netconf.restconf-api - 9.0.1 | Consecutive slashes in REST URLs will be rejected 2025-10-14T01:11:10,015 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done false 2025-10-14T01:11:10,015 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done true 2025-10-14T01:11:12,016 | INFO | sshd-SshServer[3c7da783](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.171.102:33674 authenticated 2025-10-14T01:11:12,537 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Cluster Restart 2025-10-14T01:17:06,125 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node1 2025-10-14T01:17:06,736 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch 
After Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Cluster Restart 2025-10-14T01:17:07,248 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node1 2025-10-14T01:17:07,712 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node1 2025-10-14T01:17:08,146 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster 2025-10-14T01:17:08,559 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Leader Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Leader Before Leader Restart 2025-10-14T01:17:09,768 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster 
HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Leader 2025-10-14T01:17:12,951 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Leader 2025-10-14T01:17:13,434 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Leader Restart 2025-10-14T01:17:13,434 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-14T01:17:13,754 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-14T01:17:13,919 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 
4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Leader Restart 2025-10-14T01:17:14,113 | INFO | opendaylight-cluster-data-notification-dispatcher-45 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1 2025-10-14T01:17:14,175 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#1630754424], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=topology, dataTree=present} 2025-10-14T01:17:14,176 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#1630754424], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=topology, dataTree=present}} 2025-10-14T01:17:14,176 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2} with 
ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#1630754424], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=topology, dataTree=present}} in 476.0 μs 2025-10-14T01:17:48,278 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.170.116:2550: 4461 millis 2025-10-14T01:17:48,281 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.161:2550: 4465 millis 2025-10-14T01:18:55,699 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Leader From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Leader From Cluster Node 2025-10-14T01:18:56,023 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL3 10.30.171.161" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Killing ODL3 10.30.171.161 2025-10-14T01:18:59,919 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node 
2025-10-14T01:19:00,718 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T01:19:00,718 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550 2025-10-14T01:19:01,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.171.161:2550, Up)]. 2025-10-14T01:19:04,486 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1278661122] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:19:04,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-topology-config#-1478094543] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:19:04,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#2023042842] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:19:04,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-toaster-operational#417466158] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-14T01:19:04,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#-1086354548] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-14T01:19:04,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-toaster-config#186604089] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:19:04,633 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:06,494 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.161:2550 is unreachable
2025-10-14T01:19:06,501 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate): Starting new election term 72
2025-10-14T01:19:06,502 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 72
2025-10-14T01:19:06,502 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@798f6939
2025-10-14T01:19:06,502 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Follower to Candidate
2025-10-14T01:19:06,503 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Follower to Candidate
2025-10-14T01:19:08,843 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Association to [pekko://opendaylight-cluster-data@10.30.171.161:2550] with UID [8236155373813206741] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. Reason: Cluster member removed, previous status [Down]
2025-10-14T01:19:08,843 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:19:08,843 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:19:12,192 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:12,584 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Leader from Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Leader from Cluster Node
2025-10-14T01:19:13,141 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: Association$OutboundStreamStopQuarantinedSignal$:
2025-10-14T01:19:13,594 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:13,751 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:13,827 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL3 10.30.171.161" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting ODL3 10.30.171.161
2025-10-14T01:19:14,792 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:15,832 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:16,351 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:16,551 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate): Starting new election term 73
2025-10-14T01:19:16,877 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:17,392 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:17,911 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:18,430 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:18,952 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:19,471 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:19:20,256 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-516332942]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550]
2025-10-14T01:19:20,256 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.171.168:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-516332942]] (version [1.0.3])
2025-10-14T01:19:20,328 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.161:2550] is JOINING, roles [member-3, dc-default], version [0.0.0]
2025-10-14T01:19:22,159 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:19:22,159 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:19:22,159 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config
2025-10-14T01:19:22,159 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config
2025-10-14T01:19:22,160 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config
2025-10-14T01:19:22,160 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config
2025-10-14T01:19:22,160 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config
2025-10-14T01:19:22,160 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config
2025-10-14T01:19:22,160 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config
2025-10-14T01:19:22,160 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config
2025-10-14T01:19:22,160 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational
2025-10-14T01:19:22,161 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational
2025-10-14T01:19:22,161 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational
2025-10-14T01:19:22,161 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational
2025-10-14T01:19:22,161 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational
2025-10-14T01:19:22,161 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational
2025-10-14T01:19:22,161 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational
2025-10-14T01:19:22,162 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational
2025-10-14T01:19:22,162 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready
2025-10-14T01:19:25,269 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-3-shard-toaster-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 28672, lastApplied : -1, commitIndex : -1
2025-10-14T01:19:25,269 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-3-shard-toaster-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 28673, lastApplied : -1, commitIndex : -1
2025-10-14T01:19:25,271 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=5, success=true, followerId=member-3-shard-topology-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 28675, lastApplied : -1, commitIndex : -1
2025-10-14T01:19:25,284 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-10-14T01:19:25,284 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-10-14T01:19:25,289 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=6, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 28693, lastApplied : 284, commitIndex : 284
2025-10-14T01:19:25,289 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 28692, lastApplied : 40, commitIndex : 40
2025-10-14T01:19:25,289 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=6, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 283, snapshotTerm: 6, replicatedToAllIndex: 283
2025-10-14T01:19:25,289 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 39, snapshotTerm: 4, replicatedToAllIndex: 39
2025-10-14T01:19:25,289 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): follower member-3-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2025-10-14T01:19:25,290 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 28693, lastApplied : 5, commitIndex : 5
2025-10-14T01:19:25,290 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): Initiating install snapshot to follower member-3-shard-inventory-operational: follower nextIndex: 0, leader snapshotIndex: 283, leader lastIndex: 284, leader log size: 1
2025-10-14T01:19:25,290 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): follower member-3-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2025-10-14T01:19:25,290 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=284, lastAppliedTerm=6, lastIndex=284, lastTerm=6, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-inventory-operational
2025-10-14T01:19:25,290 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 4, snapshotTerm: 4, replicatedToAllIndex: 4
2025-10-14T01:19:25,291 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): follower member-3-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2025-10-14T01:19:25,291 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): Initiating install snapshot to follower member-3-shard-topology-operational: follower nextIndex: 0, leader snapshotIndex: 4, leader lastIndex: 5, leader log size: 1
2025-10-14T01:19:25,291 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=5, lastAppliedTerm=4, lastIndex=5, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-topology-operational
2025-10-14T01:19:25,290 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): Initiating install snapshot to follower member-3-shard-default-operational: follower nextIndex: 0, leader snapshotIndex: 39, leader lastIndex: 40, leader log size: 1
2025-10-14T01:19:25,292 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=40, lastAppliedTerm=4, lastIndex=40, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-default-operational
2025-10-14T01:19:25,294 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Persising snapshot at EntryInfo[index=5, term=4]/EntryInfo[index=5, term=4]
2025-10-14T01:19:25,294 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 4 and term: 4
2025-10-14T01:19:25,295 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Persising snapshot at EntryInfo[index=40, term=4]/EntryInfo[index=40, term=4]
2025-10-14T01:19:25,296 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Persising snapshot at EntryInfo[index=284, term=6]/EntryInfo[index=284, term=6]
2025-10-14T01:19:25,296 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 39 and term: 4
2025-10-14T01:19:25,296 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 283 and term: 6
2025-10-14T01:19:25,298 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: snapshot is durable as of 2025-10-14T01:19:25.294431018Z
2025-10-14T01:19:25,299 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: snapshot is durable as of 2025-10-14T01:19:25.296607310Z
2025-10-14T01:19:25,302 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: snapshot is durable as of 2025-10-14T01:19:25.296399776Z
2025-10-14T01:19:25,367 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=6, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 283, snapshotTerm: 6, replicatedToAllIndex: 283
2025-10-14T01:19:25,367 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): follower member-3-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2025-10-14T01:19:25,368 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 39, snapshotTerm: 4, replicatedToAllIndex: 39
2025-10-14T01:19:25,368 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): follower member-3-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2025-10-14T01:19:25,369 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 4, snapshotTerm: 4, replicatedToAllIndex: 4
2025-10-14T01:19:25,369 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): follower member-3-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2025-10-14T01:19:25,416 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): Snapshot successfully installed on follower member-3-shard-topology-operational (last chunk 1) - matchIndex set to 5, nextIndex set to 6
2025-10-14T01:19:25,440 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): Snapshot successfully installed on follower member-3-shard-inventory-operational (last chunk 1) - matchIndex set to 284, nextIndex set to 285
2025-10-14T01:19:25,447 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): Snapshot successfully installed on follower member-3-shard-default-operational (last chunk 1) - matchIndex set to 40, nextIndex set to 41
2025-10-14T01:19:25,446 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-3-shard-default-config, logLastIndex=160, logLastTerm=4, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 28850, lastApplied : 160, commitIndex : 160
2025-10-14T01:19:25,797 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS
2025-10-14T01:19:26,606 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate): Starting new election term 74
2025-10-14T01:19:26,615 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 74
2025-10-14T01:19:26,616 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Candidate to Leader
2025-10-14T01:19:26,616 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@784b1671
2025-10-14T01:19:26,616 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Candidate to Leader
2025-10-14T01:19:26,616 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready
2025-10-14T01:19:27,251 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, nanosAgo=97874066236, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=3}
2025-10-14T01:19:27,389 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=2}, nanosAgo=98012227846, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=3}
2025-10-14T01:19:28,155 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=2}, nanosAgo=33127502272, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=3}
2025-10-14T01:19:28,335 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: Store Tx member-3-datastore-operational-fe-3-txn-3-0: Conflicting modification for path /(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)streams/stream/stream[{(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)name=NETCONF}].
2025-10-14T01:19:37,291 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Leader Restart
2025-10-14T01:20:23,799 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.170.116:2550: 4001 millis
2025-10-14T01:20:23,881 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.161:2550: 4002 millis
2025-10-14T01:25:29,825 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Leader
2025-10-14T01:25:33,187 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Leader Restart
2025-10-14T01:25:33,664 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-10-14T01:25:33,863 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-10-14T01:25:34,039 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, nanosAgo=367422614156, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=3}
2025-10-14T01:25:34,311 | INFO | opendaylight-cluster-data-notification-dispatcher-78 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1
2025-10-14T01:27:13,684 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node After Leader Restart
2025-10-14T01:27:14,274 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-10-14T01:27:14,274 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-10-14T01:27:14,780 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS
2025-10-14T01:27:16,456 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Leader Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Leader Node
2025-10-14T01:27:16,912 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Leader Restart
2025-10-14T01:27:17,343 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before follower Restart
2025-10-14T01:27:18,501 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node2
2025-10-14T01:27:21,409 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower Node2
2025-10-14T01:27:21,837 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Follower Restart
2025-10-14T01:27:21,844 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-10-14T01:27:22,044 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-10-14T01:27:22,112 | INFO | opendaylight-cluster-data-notification-dispatcher-103 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1
2025-10-14T01:27:22,324 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Follower Restart
2025-10-14T01:29:04,659 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Follower Node2
2025-10-14T01:29:04,949 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL3 10.30.171.161" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Killing ODL3 10.30.171.161
2025-10-14T01:29:08,929 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 and Exit" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 and Exit
2025-10-14T01:29:09,715 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:29:09,716 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:29:10,764 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.171.161:2550, Up)].
2025-10-14T01:29:13,214 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#-1086354548] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [372] dead letters encountered, of which 361 were not logged. The counter will be reset now. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,215 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-topology-config#-1478094543] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [1] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,215 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1959995188] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [2] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,215 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#1291882518] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,215 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-28 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-toaster-operational#417466158] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,216 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1959995188] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,216 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#1630754424] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,217 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-toaster-config#186604089] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,217 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#2023042842] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,217 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#-1086354548] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,217 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1278661122] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-14T01:29:13,239 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:16,928 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:29:16,928 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:29:16,929 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Association to [pekko://opendaylight-cluster-data@10.30.171.161:2550] with UID [6204537638301437607] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. Reason: Cluster member removed, previous status [Down]
2025-10-14T01:29:20,510 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:21,031 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:21,462 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Follower Node2
2025-10-14T01:29:21,551 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:21,714 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL3 10.30.171.161" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting ODL3 10.30.171.161
2025-10-14T01:29:22,590 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:23,110 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:23,631 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:24,151 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:24,669 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:25,190 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:26,230 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:26,750 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-26 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.161:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.161/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-14T01:29:27,824 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-1575578741]] to [pekko://opendaylight-cluster-data@10.30.171.168:2550]
2025-10-14T01:29:27,824 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.171.168:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.171.161:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-1575578741]] (version [1.0.3])
2025-10-14T01:29:27,894 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.171.168:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.161:2550] is JOINING, roles [member-3, dc-default], version [0.0.0]
2025-10-14T01:29:29,065 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config
2025-10-14T01:29:29,065 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.161:2550
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-default-config
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-topology-config
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-default-operational
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready
2025-10-14T01:29:29,066 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-inventory-config
2025-10-14T01:29:29,067 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-topology-operational
2025-10-14T01:29:29,067 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational
2025-10-14T01:29:29,067 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-config/member-3-shard-toaster-config
2025-10-14T01:29:29,067 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-inventory-operational
2025-10-14T01:29:29,067 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational
2025-10-14T01:29:29,067 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.161:2550/user/shardmanager-operational/member-3-shard-toaster-operational
2025-10-14T01:29:29,067 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready
2025-10-14T01:29:32,694 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15
| ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-14T01:29:32,694 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-14T01:29:32,794 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=5, success=true, followerId=member-3-shard-topology-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27339, lastApplied : -1, commitIndex : -1 2025-10-14T01:29:32,794 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-3-shard-toaster-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27249, lastApplied : -1, commitIndex : -1 2025-10-14T01:29:32,794 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-toaster-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : 
AppendEntriesReply{term=4, success=true, followerId=member-3-shard-toaster-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27339, lastApplied : -1, commitIndex : -1 2025-10-14T01:29:32,979 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27523, lastApplied : 56, commitIndex : 56 2025-10-14T01:29:32,979 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 55, snapshotTerm: 4, replicatedToAllIndex: 55 2025-10-14T01:29:32,979 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): follower member-3-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-14T01:29:32,979 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): Initiating 
install snapshot to follower member-3-shard-default-operational: follower nextIndex: 0, leader snapshotIndex: 55, leader lastIndex: 56, leader log size: 1 2025-10-14T01:29:32,980 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=56, lastAppliedTerm=4, lastIndex=56, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-default-operational 2025-10-14T01:29:32,982 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Persising snapshot at EntryInfo[index=56, term=4]/EntryInfo[index=56, term=4] 2025-10-14T01:29:32,982 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 55 and term: 4 2025-10-14T01:29:32,984 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 55, snapshotTerm: 4, replicatedToAllIndex: 55 2025-10-14T01:29:32,985 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): follower member-3-shard-default-operational appears to be 
behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-14T01:29:32,987 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational: snapshot is durable as of 2025-10-14T01:29:32.982923332Z 2025-10-14T01:29:33,007 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=6, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 28002, lastApplied : 892, commitIndex : 892 2025-10-14T01:29:33,007 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27561, lastApplied : 23, commitIndex : 23 2025-10-14T01:29:33,007 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=6, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, 
recipientRaftVersion=5}, leader snapshotIndex: 891, snapshotTerm: 6, replicatedToAllIndex: 891 2025-10-14T01:29:33,007 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): follower member-3-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-14T01:29:33,007 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 22, snapshotTerm: 4, replicatedToAllIndex: 22 2025-10-14T01:29:33,007 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): follower member-3-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-14T01:29:33,007 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): Initiating install snapshot to follower member-3-shard-inventory-operational: follower nextIndex: 0, leader snapshotIndex: 891, leader lastIndex: 892, leader log size: 1 2025-10-14T01:29:33,007 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=892, 
lastAppliedTerm=6, lastIndex=892, lastTerm=6, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-inventory-operational 2025-10-14T01:29:33,007 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): Initiating install snapshot to follower member-3-shard-topology-operational: follower nextIndex: 0, leader snapshotIndex: 22, leader lastIndex: 23, leader log size: 1 2025-10-14T01:29:33,007 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=23, lastAppliedTerm=4, lastIndex=23, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-topology-operational 2025-10-14T01:29:33,008 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Persising snapshot at EntryInfo[index=23, term=4]/EntryInfo[index=23, term=4] 2025-10-14T01:29:33,009 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 22 and term: 4 2025-10-14T01:29:33,009 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Persising snapshot at EntryInfo[index=892, term=6]/EntryInfo[index=892, term=6] 2025-10-14T01:29:33,009 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - 
org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 891 and term: 6 2025-10-14T01:29:33,011 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 22, snapshotTerm: 4, replicatedToAllIndex: 22 2025-10-14T01:29:33,012 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): follower member-3-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-14T01:29:33,012 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=6, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 891, snapshotTerm: 6, replicatedToAllIndex: 891 2025-10-14T01:29:33,012 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, 
followerId=member-3-shard-default-config, logLastIndex=182, logLastTerm=4, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27557, lastApplied : 182, commitIndex : 182 2025-10-14T01:29:33,012 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): follower member-3-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-14T01:29:33,012 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational: snapshot is durable as of 2025-10-14T01:29:33.009056855Z 2025-10-14T01:29:33,013 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational: snapshot is durable as of 2025-10-14T01:29:33.009582656Z 2025-10-14T01:29:33,040 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-topology-operational (Leader): Snapshot successfully installed on follower member-3-shard-topology-operational (last chunk 1) - matchIndex set to 23, nextIndex set to 24 2025-10-14T01:29:33,107 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-operational (Leader): Snapshot successfully installed on follower member-3-shard-inventory-operational (last chunk 1) - matchIndex set to 892, nextIndex set to 893 2025-10-14T01:29:33,123 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-default-operational (Leader): 
Snapshot successfully installed on follower member-3-shard-default-operational (last chunk 1) - matchIndex set to 56, nextIndex set to 57 2025-10-14T01:29:33,200 | INFO | node-cleaner-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-14T01:29:34,787 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=74, success=true, followerId=member-3-shard-inventory-config, logLastIndex=112624, logLastTerm=74, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 29242, lastApplied : 112624, commitIndex : 112624 2025-10-14T01:29:34,889 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=3}, nanosAgo=606572179139, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=4} 2025-10-14T01:29:35,144 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=3}, nanosAgo=136746402922, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=4} 2025-10-14T01:29:35,677 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=3}, nanosAgo=30676309840, purgedHistories=MutableUnsignedLongSet{span=[5..5], size=1}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=4} 2025-10-14T01:29:35,871 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-default-operational: Store Tx member-3-datastore-operational-fe-4-txn-3-0: Conflicting modification for path /(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)streams/stream/stream[{(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)name=NETCONF}]. 2025-10-14T01:29:44,803 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Follower Node2 Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Follower Node2 Restart 2025-10-14T01:35:37,310 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node2 2025-10-14T01:35:40,542 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Follower Node2 Restart" | core | 112 - 
org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Follower Node2 Restart 2025-10-14T01:35:41,164 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-14T01:35:41,324 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-14T01:35:41,509 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=3}, nanosAgo=499363769233, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=4} 2025-10-14T01:35:41,865 | INFO | opendaylight-cluster-data-notification-dispatcher-179 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1 2025-10-14T01:37:21,046 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 2025-10-14T01:37:21,543 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-14T01:37:21,544 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-14T01:37:22,049 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-14T01:37:23,707 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node 2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node 2 2025-10-14T01:37:24,147 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Follower Node2 Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Follower Node2 Restart 2025-10-14T01:37:26,626 | INFO | sshd-SshServer[3c7da783](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.171.102:51204 authenticated 2025-10-14T01:37:27,491 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite 
/w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/040__Cluster_Current_Term_Verification_3Node_Cluster.robot" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/040__Cluster_Current_Term_Verification_3Node_Cluster.robot 2025-10-14T01:37:27,896 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Check Shard And Get Inventory" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Check Shard And Get Inventory 2025-10-14T01:37:32,915 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Initial Current Term Verification" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Initial Current Term Verification 2025-10-14T01:37:33,352 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Add Bulk Flow From Follower 2025-10-14T01:37:33,844 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, nanosAgo=1087227458393, 
purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}
2025-10-14T01:37:34,906 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.lang.IllegalArgumentException: newPosition > limit: (15476419 > 5742013)
	at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
	at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
	at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
	at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
2025-10-14T01:37:34,952 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | newPosition > limit: (15476419 > 5742013)
2025-10-14T01:37:50,254 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15349 ms in state COMMIT_PENDING
2025-10-14T01:37:50,255 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:38:05,313 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T01:38:05,314 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:38:20,373 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:38:20,373 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:38:50,493 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 16602 ms in state COMMIT_PENDING
2025-10-14T01:38:50,494 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:39:05,543 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:39:05,543 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:39:20,583 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING
2025-10-14T01:39:20,584 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:39:50,673 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 16743 ms in state COMMIT_PENDING
2025-10-14T01:39:50,673 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:40:03,956 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (15476419 > 5742013)
	at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
	at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
	at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
	at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
	... 2 more
2025-10-14T01:40:03,958 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T01:40:05,713 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING
2025-10-14T01:40:05,714 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:40:20,763 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:40:20,763 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:40:33,974 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (15476419 > 5742013)
	at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
	at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
	at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
	at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
	... 2 more
2025-10-14T01:40:33,976 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T01:40:36,246 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Leader): Term 75 in "RequestVote{term=75, candidateId=member-2-shard-inventory-config, lastLogIndex=112630, lastLogTerm=74}" message is greater than leader's term 74 - switching to Follower
2025-10-14T01:40:36,252 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Leader) :- Switching from behavior Leader to Follower, election term: 75
2025-10-14T01:40:36,253 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@5a029b8c
2025-10-14T01:40:36,253 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Leader to Follower
2025-10-14T01:40:36,253 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Leader to Follower
2025-10-14T01:40:36,272 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2981c08
2025-10-14T01:40:36,273 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready
2025-10-14T01:40:36,273 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done false
2025-10-14T01:40:36,282 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done true
2025-10-14T01:40:36,299 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config (Follower): Snapshot received from leader: member-2-shard-inventory-config
2025-10-14T01:40:36,300 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: Applying snapshot on follower: PlainSnapshotSource{io=MemoryStreamSource{size=539}}
2025-10-14T01:40:36,307 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-1-shard-inventory-config: snapshot is durable as of 2025-10-14T01:40:36.302635299Z
2025-10-14T01:40:36,307 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardSnapshotCohort | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Applying snapshot
2025-10-14T01:40:36,308 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: applying state snapshot with pending transactions
2025-10-14T01:40:36,309 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardSnapshotCohort | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Done applying snapshot
2025-10-14T01:40:36,310 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (15476419 > 5742013)
	at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
	at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
	at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
	at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
	... 2 more
2025-10-14T01:40:36,310 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T01:40:37,336 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (15476419 > 5742013)
	at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
	at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
	at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
	at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
	... 2 more
2025-10-14T01:40:37,337 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T01:40:37,340 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (15476419 > 5742013)
	at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
	at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
	at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
	at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
	at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
	... 2 more
2025-10-14T01:40:37,341 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-14T01:40:50,874 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 16903 ms in state COMMIT_PENDING
2025-10-14T01:40:50,874 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:41:05,933 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:41:05,934 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:41:20,983 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:41:20,984 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:41:36,033 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T01:41:36,034 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:41:51,074 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING
2025-10-14T01:41:51,074 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:42:06,133 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:42:06,134 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:42:21,193 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:42:21,194 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:42:36,253 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T01:42:36,254 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:42:51,303 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:42:51,304 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:43:06,353 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:43:06,354 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:43:21,413 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:43:21,414 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:43:36,463 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:43:36,463 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:43:51,514 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T01:43:51,514 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:44:06,573 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:44:06,573 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:44:14,226 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Get Bulk Flows And Verify In Cluster" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Get Bulk Flows And Verify In Cluster
2025-10-14T01:44:14,572 | INFO | qtp617931180-512 | StaticConfiguration | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding-over-DOM codec shortcuts are enabled
2025-10-14T01:44:14,611 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:15,628 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:16,648 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:17,669 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:18,689 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:19,708 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:20,728 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:21,623 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T01:44:21,624 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:44:21,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:22,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:23,788 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:44:24,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:25,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:26,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:27,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:44:28,888 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:29,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:30,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:31,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:44:32,968 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:33,988 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:35,012 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:36,028 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:36,684 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T01:44:36,684 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:44:37,048 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:38,068 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:39,088 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:40,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:41,128 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:42,148 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:43,169 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:44,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:45,208 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:46,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:47,249 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:44:48,269 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:49,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:50,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:44:51,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:51,743 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:44:51,744 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:44:52,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:53,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:54,388 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:55,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:56,427 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:57,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:58,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:44:59,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:00,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:01,528 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:02,548 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:45:03,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:04,588 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:05,608 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:06,628 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:45:06,803 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:45:06,804 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:45:07,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:45:08,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:45:09,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:45:10,707 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:45:11,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:45:12,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:45:13,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:45:14,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:45:15,809 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:16,828 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:17,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:45:18,869 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:19,888 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:20,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:45:21,843 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING 2025-10-14T01:45:21,844 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:45:21,930 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:22,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
[identical "Failed to resolve shard" WARN entries, each with the same TimeoutException / NotLeaderException stack trace as above, repeated at roughly one-second intervals: 2025-10-14T01:45:23,968, 01:45:24,987, 01:45:26,008, 01:45:27,028, 01:45:28,047, 01:45:29,069, 01:45:30,087, 01:45:31,108 ...]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:32,128 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:45:33,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:34,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:35,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:36,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:45:36,893 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T01:45:36,893 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:45:37,229 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:45:38,248 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:39,268 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:45:40,287 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:41,308 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:42,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:43,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:45:44,369 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:45,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:46,408 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:47,428 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:48,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:49,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:50,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:51,508 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:51,943 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T01:45:51,943 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:45:52,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:45:53,548 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:54,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:45:55,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:56,608 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:57,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:45:58,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:45:59,668 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:00,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:01,707 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:02,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:46:03,749 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:04,768 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:05,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:06,808 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:46:06,983 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15039 ms in state COMMIT_PENDING 2025-10-14T01:46:06,984 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:46:07,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:08,848 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:09,869 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:10,888 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:11,908 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:12,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:13,948 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:14,628 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-11-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.023355571 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-11-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.023355571 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-11-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.023355571 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:14,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:15,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:17,008 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:18,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:19,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:20,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:21,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:46:22,033 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-10-14T01:46:22,034 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:46:22,108 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:23,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:24,149 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:46:25,166 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:26,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:27,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:28,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
	... 5 more
2025-10-14T01:46:29,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:46:30,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:46:31,288 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:46:32,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:46:33,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:46:34,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:46:35,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:46:36,386 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:46:37,093 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:46:37,094 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:46:37,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:38,428 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:39,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:46:40,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:41,488 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:42,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:46:43,528 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:46:44,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	[remaining CompletableFuture/ForkJoinPool frames and the "Caused by: NotLeaderException" chain are identical to the trace above]
[The identical "Failed to resolve shard" warning and stack trace from AbstractShardBackendResolver repeated roughly once per second, at 01:46:45,567, 01:46:46,588, 01:46:47,607, 01:46:48,626, 01:46:49,648, 01:46:50,668 and 01:46:51,687.]
2025-10-14T01:46:52,153 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:46:52,155 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:46:52,709 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:46:53,728 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:46:54,748 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:46:55,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:46:56,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:46:57,808 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:46:58,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:46:59,846 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:00,868 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:01,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:02,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:03,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:04,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:05,969 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:06,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
5 more 2025-10-14T01:47:07,213 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T01:47:07,213 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:47:08,009 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:09,028 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:10,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:47:11,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:12,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:13,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:14,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
[The identical WARN entry "ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed", with the same TimeoutException/NotLeaderException stack trace as above, repeats at roughly one-second intervals at 2025-10-14T01:47:15,148; 01:47:16,167; 01:47:17,187; 01:47:18,206; 01:47:19,227; 01:47:20,247; and 01:47:21,267.]
2025-10-14T01:47:22,273 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:47:22,273 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
[The "Failed to resolve shard" WARN then repeats again with the same stack trace at 2025-10-14T01:47:22,286 and 2025-10-14T01:47:23,307; the final trace is cut off at the end of this excerpt.]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:24,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:47:25,348 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:26,368 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:27,395 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:28,417 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:47:29,437 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:30,457 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:31,476 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:32,497 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:33,520 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:34,538 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:35,557 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:47:36,577 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:37,333 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:47:37,333 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:47:37,597 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:47:38,620 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:47:39,637 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:47:40,657 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:47:41,680 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:47:42,696 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:47:43,717 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:47:44,737 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:47:45,757 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:46,778 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:47,797 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:48,817 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:49,837 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:50,860 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:51,879 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:47:52,373 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING
2025-10-14T01:47:52,373 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:47:52,899 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:53,918 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:54,937 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:47:55,963 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:56,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:58,008 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:47:59,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:48:00,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:01,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:48:02,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:48:03,108 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:48:04,128 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:48:05,148 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:48:06,168 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:48:07,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:48:07,423 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T01:48:07,424 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:48:08,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:09,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:10,249 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:48:11,269 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:12,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:13,308 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:14,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:48:14,654 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-12-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.023201507 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] 
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-12-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.023201507 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-12-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.023201507 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:48:15,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:48:16,368 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:48:17,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:48:18,408 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:48:19,427 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:48:20,448 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:48:21,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	...
5 more 2025-10-14T01:48:22,483 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T01:48:22,484 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:48:22,488 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:23,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:24,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:48:25,548 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:26,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:27,591 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:48:28,609 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	...
5 more
2025-10-14T01:48:29,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:30,648 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:31,668 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:32,688 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:33,708 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:34,728 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:35,748 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:36,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	...
5 more 2025-10-14T01:48:37,543 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T01:48:37,544 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:48:37,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:38,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:39,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:48:40,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:41,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:42,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:48:43,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	...
5 more
2025-10-14T01:48:44,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:45,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:46,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:47,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:49,007 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:50,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:51,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:52,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:48:52,593 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T01:48:52,593 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:48:53,086 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:54,108 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:55,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:56,148 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:57,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:58,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:48:59,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:49:00,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:01,248 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:02,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:03,287 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:49:04,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:05,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:06,346 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:07,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:49:07,643 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-10-14T01:49:07,644 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:49:08,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:09,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:10,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:49:11,446 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:12,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:13,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:14,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:15,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:16,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:17,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:18,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:19,606 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:20,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:21,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:22,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:49:22,693 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T01:49:22,694 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:49:23,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:24,708 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:25,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:49:26,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:27,768 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:28,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:29,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:30,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:31,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:32,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:33,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:34,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:35,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:36,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:37,753 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:49:37,753 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:49:37,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:38,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:40,008 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:49:41,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:42,048 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:43,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:44,086 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-14T01:49:45,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:49:46,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[remaining frames identical to the 01:49:45,107 stack trace]
	... 5 more
2025-10-14T01:49:47,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:49:45,107 occurrence]
2025-10-14T01:49:48,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:49:45,107 occurrence]
2025-10-14T01:49:49,186 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:49:45,107 occurrence]
2025-10-14T01:49:50,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:49:45,107 occurrence]
2025-10-14T01:49:51,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:49:45,107 occurrence]
2025-10-14T01:49:52,248 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:49:45,107 occurrence]
2025-10-14T01:49:52,813 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:49:52,813 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:49:53,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:49:45,107 occurrence]
2025-10-14T01:49:54,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:55,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:49:56,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:57,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:58,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:49:59,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-14T01:50:00,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
[the same "Failed to resolve shard" warning recurred roughly once per second with a TimeoutException/NotLeaderException stack trace identical to the one above; only the header lines are kept for the repeats below]
2025-10-14T01:50:01,428 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-14T01:50:02,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-14T01:50:03,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-14T01:50:04,486 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-14T01:50:05,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-14T01:50:06,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-14T01:50:07,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-14T01:50:07,873 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:50:07,873 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:50:08,568 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-14T01:50:09,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:10,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:50:11,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:12,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:13,666 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:50:14,684 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-13-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.026052141 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] 
at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-13-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.026052141 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-13-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.026052141 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:50:14,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:50:15,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:50:16,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:50:17,748 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:50:18,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:50:19,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:50:20,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-14T01:50:21,828 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:50:22,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:50:22,914 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING
2025-10-14T01:50:22,914 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:50:23,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:50:24,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:50:25,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:50:26,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:50:27,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	...
	... 5 more
2025-10-14T01:50:28,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace omitted: NotLeaderException on member-2-shard-inventory-config, as above]
2025-10-14T01:50:29,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace omitted]
2025-10-14T01:50:31,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace omitted]
2025-10-14T01:50:32,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace omitted]
2025-10-14T01:50:33,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace omitted]
2025-10-14T01:50:34,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace omitted]
2025-10-14T01:50:35,088 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace omitted]
2025-10-14T01:50:36,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace omitted]
2025-10-14T01:50:37,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:37,963 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T01:50:37,963 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:50:38,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:39,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:50:40,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:41,208 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:42,228 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:43,246 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:50:44,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:45,288 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:46,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:47,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:50:48,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:49,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:50,386 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:51,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:50:52,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:50:53,013 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T01:50:53,014 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:50:53,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:50:54,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:50:55,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:50:55,794 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Adding Bulk Flow" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Adding Bulk Flow
2025-10-14T01:50:56,296 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before And After Addition Of Flow" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before And After Addition Of Flow
2025-10-14T01:50:56,508 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:50:56,722 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete and Add ten percent of the flows for 5 iterations" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete and Add ten percent of the flows for 5 iterations
2025-10-14T01:50:57,526 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:50:58,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:50:59,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:51:00,588 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:01,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:02,628 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:03,649 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:51:04,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:05,688 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:06,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:07,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:08,073 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:51:08,073 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:51:08,748 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:09,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:10,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:11,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:12,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:13,849 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:14,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:51:15,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:16,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:17,926 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:18,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:51:19,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:20,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:22,009 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:23,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:23,133 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T01:51:23,134 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:51:24,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:25,066 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:26,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:27,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:28,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:29,148 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:30,168 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:51:31,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:32,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:33,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:34,249 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:51:35,269 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:36,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:37,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:38,193 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:51:38,193 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:51:38,329 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:39,346 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:40,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:41,386 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:42,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:43,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:44,446 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:51:45,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:46,486 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:47,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:48,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:51:49,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:50,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:51,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:52,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:51:53,253 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T01:51:53,254 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:51:53,628 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:54,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:55,666 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:51:56,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:57,707 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:51:58,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:51:59,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	...
5 more
2025-10-14T01:52:00,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:01,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:02,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:03,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:04,846 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:05,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:06,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:07,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:52:08,313 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:52:08,314 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:52:08,926 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:09,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:10,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:11,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:13,007 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:14,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:52:14,715 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-14-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.026681186 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-14-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.026681186 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-14-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.026681186 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:15,046 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:16,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:17,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:18,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:19,129 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:20,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:21,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:52:22,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:23,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:23,373 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T01:52:23,373 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:52:24,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:52:25,248 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:26,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:27,287 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:28,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:29,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:30,348 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:31,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:32,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:33,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:34,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:35,446 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:52:36,466 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:52:37,486 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:38,433 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T01:52:38,434 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:52:38,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:39,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:52:40,546 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:41,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:42,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:43,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:52:44,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:45,646 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:46,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:47,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:52:48,708 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:49,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:50,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:51,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:52:52,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:53,483 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T01:52:53,483 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:52:53,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:54,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:52:55,846 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:56,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:57,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:52:58,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ...
5 more 2025-10-14T01:52:59,926 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:00,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:01,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:02,989 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:53:04,007 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:05,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:06,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:53:07,066 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:53:08,086 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:53:08,533 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:53:08,533 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:53:09,106 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:53:10,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:53:11,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:53:12,169 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:53:13,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:53:14,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:53:15,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:16,246 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:17,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:18,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:53:19,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:20,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:21,348 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:53:22,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:53:23,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:53:23,594 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T01:53:23,594 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:53:24,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:53:25,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:53:26,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:53:27,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:53:28,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:53:29,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... 5 more
2025-10-14T01:53:30,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the 01:53:29,506 entry]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 01:53:29,506 entry]
2025-10-14T01:53:31,546 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the 01:53:29,506 entry]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 01:53:29,506 entry]
2025-10-14T01:53:32,568 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the 01:53:29,506 entry]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 01:53:29,506 entry]
2025-10-14T01:53:33,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the 01:53:29,506 entry]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 01:53:29,506 entry]
2025-10-14T01:53:34,606 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the 01:53:29,506 entry]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 01:53:29,506 entry]
2025-10-14T01:53:35,626 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the 01:53:29,506 entry]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 01:53:29,506 entry]
2025-10-14T01:53:36,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the 01:53:29,506 entry]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 01:53:29,506 entry]
2025-10-14T01:53:37,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:53:38,653 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T01:53:38,653 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:53:38,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:39,707 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:40,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:53:41,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:42,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:43,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:44,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:53:45,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:46,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:47,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:48,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:53:49,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:50,928 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:53:51,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:53:52,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:53:53,703 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:53:53,703 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:53:53,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:53:55,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:53:56,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:53:57,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:53:58,069 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:53:59,086 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:00,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:54:01,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:02,146 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:03,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:04,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:54:05,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:06,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:07,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:08,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:54:08,763 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T01:54:08,764 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:54:09,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:10,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:11,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:54:12,348 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:13,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:14,385 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:54:14,724 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-15-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.005777458 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] 
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-15-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.005777458 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-15-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.005777458 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:15,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:16,427 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:17,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:18,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:19,486 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:20,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:21,526 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:22,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:23,569 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:23,813 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:54:23,814 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:54:24,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:25,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:26,626 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:27,648 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:28,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:54:29,686 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:30,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:31,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:32,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:54:33,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:34,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:35,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:36,825 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ...
    5 more
2025-10-14T01:54:37,848 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:38,863 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:54:38,863 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:54:38,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:39,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ...
    5 more
2025-10-14T01:54:40,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:41,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:54:42,948 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:54:43,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:44,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:46,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:47,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:48,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:49,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:50,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:51,106 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:54:52,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:54:53,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:53,923 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T01:54:53,923 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:54:54,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:55,188 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:54:56,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:54:57,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:54:58,246 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:54:59,266 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:54:58,246 entry above]
2025-10-14T01:55:00,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:54:58,246 entry above]
2025-10-14T01:55:01,308 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:54:58,246 entry above]
2025-10-14T01:55:02,328 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:54:58,246 entry above]
2025-10-14T01:55:03,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:54:58,246 entry above]
2025-10-14T01:55:04,368 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:54:58,246 entry above]
2025-10-14T01:55:05,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:54:58,246 entry above]
2025-10-14T01:55:06,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:54:58,246 entry above]
2025-10-14T01:55:07,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:55:08,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:08,984 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T01:55:08,984 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:55:09,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:10,488 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:55:11,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:12,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:55:13,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:55:14,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:15,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:16,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:17,628 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:18,646 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:19,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:20,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:21,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:22,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:55:23,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:24,043 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T01:55:24,043 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:55:24,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:25,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:55:26,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:27,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:28,848 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:29,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:30,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:31,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:32,926 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:33,948 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:34,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:35,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:37,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:38,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:55:39,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:39,093 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T01:55:39,093 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:55:40,066 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:41,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:42,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:43,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:44,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:45,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:55:46,186 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:47,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:48,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:49,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:55:50,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:51,287 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:55:52,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:55:53,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:54,133 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING
2025-10-14T01:55:54,133 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:55:54,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:55,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:56,386 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:57,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:58,427 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:55:59,464 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:56:00,488 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:56:01,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:02,526 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:03,548 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:04,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:56:05,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:06,606 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:07,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:08,648 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:56:09,193 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T01:56:09,194 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:56:09,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:56:10,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:56:11,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:56:12,730 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:56:13,748 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:56:14,744 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-16-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.017334788 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-16-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.017334788 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-16-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.017334788 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:14,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:56:15,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:16,808 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:17,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:18,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:19,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:20,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:56:19,867 entry above]
2025-10-14T01:56:21,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:56:19,867 entry above]
2025-10-14T01:56:22,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:56:19,867 entry above]
2025-10-14T01:56:23,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:56:19,867 entry above]
2025-10-14T01:56:24,253 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:56:24,254 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:56:24,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:56:19,867 entry above]
2025-10-14T01:56:25,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:56:19,867 entry above]
2025-10-14T01:56:27,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 01:56:19,867 entry above]
2025-10-14T01:56:28,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:29,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:30,066 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:56:31,088 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:32,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:33,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:34,148 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:35,168 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:36,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:37,206 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:38,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:39,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:39,313 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T01:56:39,314 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:56:40,268 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:41,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:42,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:56:43,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:44,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:45,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:56:46,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:47,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:48,427 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:49,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:56:50,466 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:56:51,486 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:56:52,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:56:53,526 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:56:54,373 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T01:56:54,373 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:56:54,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:56:55,568 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:56:56,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:56:57,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:56:58,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:56:59,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:00,666 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:57:01,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:02,707 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:03,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:04,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:57:05,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:06,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:07,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:08,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:57:09,433 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T01:57:09,433 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:57:09,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:10,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:11,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-14T01:57:12,908 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
[last message repeated with an identical stack trace at ~1-second intervals, at 2025-10-14T01:57:13,926 / 01:57:14,946 / 01:57:15,967 / 01:57:16,986 / 01:57:18,007 / 01:57:19,027 / 01:57:20,046 / 01:57:21,068 / 01:57:22,087]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:23,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:24,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:57:24,483 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-10-14T01:57:24,483 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:57:25,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:26,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:27,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
	... 5 more
2025-10-14T01:57:28,208 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:29,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:30,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:31,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:32,287 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:33,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:34,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:35,346 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:36,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:37,329 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Continuous Deletion and Addition Of Flows for 5 iterations" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Continuous Deletion and Addition Of Flows for 5 iterations
2025-10-14T01:57:37,388 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:57:37,896 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before and After Continuous Deletion and Addition Of Flows for 5 iterations" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before and After Continuous Deletion and Addition Of Flows for 5 iterations 2025-10-14T01:57:38,399 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete All Flows From Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete All Flows From Follower Node 2025-10-14T01:57:38,409 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:39,427 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:57:39,543 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T01:57:39,544 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:57:40,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:41,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:42,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:57:43,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:44,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:45,546 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:46,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:57:47,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:48,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:49,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:57:50,648 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:51,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:52,686 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:53,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:54,603 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T01:57:54,604 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:57:54,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:55,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:56,769 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:57,788 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:57:58,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
[The same WARN entry — AbstractShardBackendResolver "Failed to resolve shard", java.util.concurrent.TimeoutException: Connection attempt failed, caused by NotLeaderException for member-2-shard-inventory-config, with stack traces identical to the above — repeats at roughly one-second intervals: 2025-10-14T01:57:59,827; 01:58:00,846; 01:58:01,866; 01:58:02,887; 01:58:03,906; 01:58:04,927; 01:58:05,947; 01:58:06,965.]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:07,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:09,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:58:09,653 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T01:58:09,654 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:58:10,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:11,046 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:12,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:58:13,085 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:14,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:14,775 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null,
info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-17-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.027484787 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-17-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.027484787 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-17-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.027484787 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:15,125 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:16,148 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:17,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:18,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:19,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:58:20,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:21,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:22,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:23,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:58:24,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:24,694 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING 2025-10-14T01:58:24,694 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:58:25,328 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:26,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:58:27,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:58:28,388 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:58:29,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:58:30,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:58:31,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:58:32,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:58:33,486 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:58:34,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:35,526 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:36,546 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:37,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:38,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:39,606 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:39,743 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T01:58:39,743 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T01:58:40,626 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:58:41,646 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-14T01:58:42,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the preceding occurrence; frames elided)
2025-10-14T01:58:43,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the preceding occurrence; frames elided)
2025-10-14T01:58:44,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the preceding occurrence; frames elided)
2025-10-14T01:58:45,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the preceding occurrence; frames elided)
2025-10-14T01:58:46,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the preceding occurrence; frames elided)
2025-10-14T01:58:47,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the preceding occurrence; frames elided)
2025-10-14T01:58:48,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the preceding occurrence; frames elided)
2025-10-14T01:58:49,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	...
5 more 2025-10-14T01:58:50,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:51,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:52,868 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:53,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:58:54,783 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING 2025-10-14T01:58:54,784 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:58:54,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:55,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:58:56,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	...
	... 5 more
2025-10-14T01:58:57,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:58:58,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:59:00,007 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:59:01,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:59:02,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:59:03,066 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:59:04,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T01:59:05,106 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:59:06,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:07,146 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:08,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:09,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:59:09,823 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING 2025-10-14T01:59:09,824 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:59:10,206 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:11,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:12,246 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:13,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	[... remaining NotLeaderException frames identical to the trace above ...]
	... 5 more
[The identical WARN "Failed to resolve shard" entry, with the same TimeoutException / NotLeaderException stack trace, was repeated once per second by ForkJoinPool.commonPool-worker-3 at 2025-10-14T01:59:14,286; 01:59:15,307; 01:59:16,327; 01:59:17,348; 01:59:18,366; 01:59:19,386; and 01:59:20,407.]
	... 5 more
2025-10-14T01:59:21,427 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:59:22,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:59:23,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T01:59:24,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
5 more 2025-10-14T01:59:24,873 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T01:59:24,874 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:59:25,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:26,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:27,546 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:28,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:29,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:30,608 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:31,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:32,646 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:33,666 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:34,686 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:35,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:59:36,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:37,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:38,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:39,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:59:39,924 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-10-14T01:59:39,924 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:59:40,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:41,828 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:42,846 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:43,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:44,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:45,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:46,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:47,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:48,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:49,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T01:59:51,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	...
5 more 2025-10-14T01:59:52,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:53,046 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:54,068 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T01:59:54,984 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T01:59:54,984 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T01:59:55,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:56,106 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T01:59:57,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-14T01:59:58,146 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T01:59:59,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:00,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:01,206 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:00:02,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:03,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:04,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:05,287 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:06,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:07,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:08,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:09,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:10,033 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T02:00:10,033 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:00:10,386 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:11,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:12,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:13,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:14,466 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:14,804 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, 
info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-18-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.025304054 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-18-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.025304054 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-18-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.025304054 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:15,488 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:00:16,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:17,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:18,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:19,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:20,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:21,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:22,626 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:23,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:24,666 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:25,093 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T02:00:25,093 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:00:25,686 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:26,707 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:00:27,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:28,748 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:29,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:30,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:00:31,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:32,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:33,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:34,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:35,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:36,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:37,926 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:38,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:39,968 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:40,133 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15039 ms in state COMMIT_PENDING
2025-10-14T02:00:40,133 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:00:40,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:00:42,007 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-14T02:00:43,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:44,048 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:45,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:46,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:47,108 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:48,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:49,148 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:50,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:00:51,188 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:52,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:53,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:54,246 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:00:55,174 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING 2025-10-14T02:00:55,174 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:00:55,266 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:00:56,287 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:00:57,308 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	...
5 more
2025-10-14T02:00:58,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:00:59,348 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:01:00,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:01:01,388 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:01:02,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:01:03,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:01:04,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:01:05,466 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:01:06,486 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:07,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:08,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:09,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:01:10,224 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T02:01:10,224 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:01:10,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:11,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:01:12,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:01:13,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 02:01:12,607 entry]
2025-10-14T02:01:14,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 02:01:12,607 entry]
2025-10-14T02:01:15,666 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 02:01:12,607 entry]
2025-10-14T02:01:16,689 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 02:01:12,607 entry]
2025-10-14T02:01:17,707 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 02:01:12,607 entry]
2025-10-14T02:01:18,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 02:01:12,607 entry]
2025-10-14T02:01:19,748 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    ... [stack trace identical to the 02:01:12,607 entry]
2025-10-14T02:01:20,768 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:01:21,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:22,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:23,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:24,848 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:01:25,283 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T02:01:25,284 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:01:25,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:26,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:01:27,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:01:28,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:01:29,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:01:30,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:01:31,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:01:33,011 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:01:34,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:01:35,048 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:01:36,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:01:37,085 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:38,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:39,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:40,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:01:40,323 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15039 ms in state COMMIT_PENDING 2025-10-14T02:01:40,323 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:01:41,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:42,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:01:43,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:01:44,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 02:01:43,207 entry above]
2025-10-14T02:01:45,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 02:01:43,207 entry above]
2025-10-14T02:01:46,266 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 02:01:43,207 entry above]
2025-10-14T02:01:47,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 02:01:43,207 entry above]
2025-10-14T02:01:48,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 02:01:43,207 entry above]
2025-10-14T02:01:49,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 02:01:43,207 entry above]
2025-10-14T02:01:50,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 02:01:43,207 entry above]
2025-10-14T02:01:51,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:01:52,386 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:53,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:54,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:01:55,383 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T02:01:55,384 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:01:55,446 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:56,466 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:01:57,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:01:58,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:01:59,526 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:02:00,546 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:02:01,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:02:02,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:02:03,606 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:02:04,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:02:05,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:02:06,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:07,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:08,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:09,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:02:10,443 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T02:02:10,443 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:02:10,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:11,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:12,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:13,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:14,824 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-19-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.017646609 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-19-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.017646609 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-19-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.017646609 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:14,828 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:15,846 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:16,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:17,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:18,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ...
5 more 2025-10-14T02:02:19,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
2025-10-14T02:02:20,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:02:21,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:02:22,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:02:24,009 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:02:25,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:02:25,503 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T02:02:25,504 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:02:26,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-14T02:02:27,066 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:28,086 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:29,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:30,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:02:31,149 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:32,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:33,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:02:34,206 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:02:35,245 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack trace identical to the 02:02:34,206 occurrence above]
2025-10-14T02:02:36,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace]
2025-10-14T02:02:37,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace]
2025-10-14T02:02:38,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace]
2025-10-14T02:02:39,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace]
2025-10-14T02:02:40,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace]
2025-10-14T02:02:40,563 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T02:02:40,564 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:02:41,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace]
2025-10-14T02:02:42,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[identical stack trace, truncated in the original]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:43,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:44,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:45,446 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:02:46,466 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:47,488 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:48,506 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:49,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:50,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:51,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:52,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:53,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:54,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:55,613 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-10-14T02:02:55,614 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:02:55,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:02:56,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:57,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:58,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:02:59,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:03:00,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:03:01,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:03:02,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:03:03,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:03:04,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:03:05,846 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:06,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:07,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:08,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:03:09,926 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:10,674 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T02:03:10,674 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:03:10,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:11,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:12,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
[The preceding "Failed to resolve shard" WARN entry and its identical TimeoutException/NotLeaderException stack trace repeat verbatim, roughly once per second, at 2025-10-14T02:03:14,006, 02:03:15,026, 02:03:16,047, 02:03:17,068, 02:03:18,086 and 02:03:19,106; only the timestamps differ.]
2025-10-14T02:03:20,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:21,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:22,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:23,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:24,206 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:25,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:25,703 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15029 ms in state COMMIT_PENDING
2025-10-14T02:03:25,703 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:03:26,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:27,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:28,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:29,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:03:30,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:03:31,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:03:32,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:03:33,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:03:34,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:03:35,427 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
5 more 2025-10-14T02:03:36,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:37,468 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:38,488 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:39,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:03:40,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:40,753 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T02:03:40,753 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:03:41,546 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:42,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:03:43,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
[The identical "Failed to resolve shard" WARN from AbstractShardBackendResolver, with the same TimeoutException ("Connection attempt failed") caused by NotLeaderException for Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636], recurred at roughly one-second intervals: 02:03:44,606; 02:03:45,626; 02:03:46,646; 02:03:47,666; 02:03:48,686; 02:03:49,707; 02:03:50,726.]
5 more 2025-10-14T02:03:51,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:52,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:53,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:54,811 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:03:55,814 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T02:03:55,814 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:03:55,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:56,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:57,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:03:58,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:03:59,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:00,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:01,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:04:02,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:03,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:05,007 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:06,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:07,046 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:08,066 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:09,086 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:10,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:10,863 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T02:04:10,864 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:04:11,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:12,146 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:13,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-14T02:04:14,186 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the first "Failed to resolve shard" occurrence above - omitted]
2025-10-14T02:04:14,855 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-20-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027239331 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-20-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027239331 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-20-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027239331 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:15,206 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    ... [remainder of stack trace identical to the first "Failed to resolve shard" occurrence above - omitted]
2025-10-14T02:04:16,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the first "Failed to resolve shard" occurrence above - omitted]
2025-10-14T02:04:17,248 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the first "Failed to resolve shard" occurrence above - omitted]
2025-10-14T02:04:18,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    ... [stack trace identical to the first "Failed to resolve shard" occurrence above - omitted]
2025-10-14T02:04:19,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:04:19,338 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Verify No Flows In Cluster After Flow Deletion" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Verify No Flows In Cluster After Flow Deletion 2025-10-14T02:04:20,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:21,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:22,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:04:23,365 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:24,386 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:25,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:25,923 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T02:04:25,923 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:04:26,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:27,446 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:28,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:29,489 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:30,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:31,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:32,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:33,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:04:34,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed [duplicate stack trace elided; identical to the occurrence above]
2025-10-14T02:04:35,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed [duplicate stack trace elided; identical to the occurrence above]
2025-10-14T02:04:36,626 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed [duplicate stack trace elided; identical to the occurrence above]
2025-10-14T02:04:37,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed [duplicate stack trace elided; identical to the occurrence above]
2025-10-14T02:04:38,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed [duplicate stack trace elided; identical to the occurrence above]
2025-10-14T02:04:39,688 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed [duplicate stack trace elided; identical to the occurrence above]
2025-10-14T02:04:40,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed [duplicate stack trace elided; identical to the occurrence above]
2025-10-14T02:04:40,983 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T02:04:40,984 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:04:41,729 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed [duplicate stack trace elided; identical to the occurrence above]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:42,748 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:43,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:04:44,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:45,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:46,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:04:47,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:04:48,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:04:49,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack traces identical to the 02:04:48,866 occurrence; omitted]
2025-10-14T02:04:50,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack traces identical to the 02:04:48,866 occurrence; omitted]
2025-10-14T02:04:51,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack traces identical to the 02:04:48,866 occurrence; omitted]
2025-10-14T02:04:52,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack traces identical to the 02:04:48,866 occurrence; omitted]
2025-10-14T02:04:53,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack traces identical to the 02:04:48,866 occurrence; omitted]
2025-10-14T02:04:54,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack traces identical to the 02:04:48,866 occurrence; omitted]
2025-10-14T02:04:56,008 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	[stack traces identical to the 02:04:48,866 occurrence; omitted]
5 more 2025-10-14T02:04:56,033 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T02:04:56,033 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:04:57,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:58,046 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:04:59,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:00,086 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:01,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:02,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:03,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:04,166 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:05,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:06,206 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:07,225 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:08,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:09,266 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:10,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:11,083 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T02:05:11,084 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:05:11,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:05:12,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:05:13,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:14,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:15,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:16,407 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:17,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:18,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:19,466 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:20,486 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:21,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:05:22,526 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:23,546 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:24,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:25,588 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:05:26,144 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T02:05:26,144 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:05:26,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:27,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:28,648 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:05:29,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:30,686 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:31,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:32,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:05:33,747 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:34,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:05:35,785 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:36,810 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:37,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:38,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:39,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:40,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:41,203 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T02:05:41,203 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:05:41,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:42,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:05:43,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:44,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:45,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:47,007 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:48,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:49,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:50,066 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:05:51,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:52,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:05:53,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:54,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:55,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:56,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:05:56,263 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T02:05:56,263 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:05:57,218 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:05:58,238 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:05:59,258 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:06:00,280 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:01,299 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:02,317 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:03,336 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:06:04,358 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:05,376 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:06:06,396 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:06:07,417 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:06:08,436 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:06:09,457 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:06:10,477 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:06:11,323 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T02:06:11,323 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:06:11,496 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:06:12,517 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:06:13,536 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:06:14,556 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:06:14,885 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-21-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.026379224 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-21-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.026379224 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-21-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.026379224 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:06:15,576 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:06:16,598 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:06:17,617 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:06:18,637 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:19,657 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:06:19,694 | ERROR | ForkJoinPool-11-worker-2 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-22-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.012436937 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] 
at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-22-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.012436937 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-22-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.012436937 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:20,676 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:21,697 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] 
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
        at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
        at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
        at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
        at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
        at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
        at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
        at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
        at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
        at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
        at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
        at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
        at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
        at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
        at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
        at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
        at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
        at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
        ... 5 more
2025-10-14T02:06:22,717 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
        at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
        at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
        at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
        at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
        at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
        at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
        at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
        at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
        at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
        at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
        at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
        at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
        at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
        at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
        at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
        at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
        at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
        at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
        at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
        at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
        at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
        at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
        at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
        at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
        at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
        at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
        at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
        at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
        at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
        at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
        at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
        at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
        at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
        at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
        at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
        at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
        at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
        at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
        at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
        ...
5 more
[the 02:06:22,717 "Failed to resolve shard" WARN entry above, with its identical TimeoutException / NotLeaderException stack trace, repeats verbatim at 02:06:23,736, 02:06:24,756 and 02:06:25,776]
2025-10-14T02:06:26,383 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T02:06:26,384 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
[the same "Failed to resolve shard" WARN entry and stack trace repeat again at 02:06:26,800, 02:06:27,817, 02:06:28,837 and 02:06:29,856]
2025-10-14T02:06:30,877 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:31,896 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:32,916 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:06:33,936 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:34,957 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:35,977 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:36,997 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:06:38,017 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:39,037 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:40,057 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:41,077 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:06:41,424 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING 2025-10-14T02:06:41,424 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:06:42,097 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:43,116 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:44,137 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
    ... 5 more
2025-10-14T02:06:45,157 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:06:46,178 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    [stack trace, including the NotLeaderException cause, identical to the 02:06:45,157 entry above]
2025-10-14T02:06:47,197 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    [stack trace, including the NotLeaderException cause, identical to the 02:06:45,157 entry above]
2025-10-14T02:06:48,216 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    [stack trace, including the NotLeaderException cause, identical to the 02:06:45,157 entry above]
2025-10-14T02:06:49,236 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    [stack trace, including the NotLeaderException cause, identical to the 02:06:45,157 entry above]
2025-10-14T02:06:50,256 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    [stack trace, including the NotLeaderException cause, identical to the 02:06:45,157 entry above]
2025-10-14T02:06:51,277 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    [stack trace, including the NotLeaderException cause, identical to the 02:06:45,157 entry above]
2025-10-14T02:06:52,297 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    [stack trace, including the NotLeaderException cause, identical to the 02:06:45,157 entry above]
2025-10-14T02:06:53,316 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    [stack trace, including the NotLeaderException cause, identical to the 02:06:45,157 entry above]
2025-10-14T02:06:54,337 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:55,357 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:56,376 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:06:56,483 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T02:06:56,483 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:06:57,396 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:58,417 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:06:59,436 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:07:00,457 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:01,476 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:02,496 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:03,517 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:07:04,537 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:05,556 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:06,576 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:07,596 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:08,617 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:09,637 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:10,657 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:11,523 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING
2025-10-14T02:07:11,523 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:07:11,677 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:12,697 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:13,723 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:14,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:15,767 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:16,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:17,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:07:18,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:19,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:20,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:21,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:07:22,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:23,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:24,164 | INFO | sshd-SshServer[3c7da783](port=8101)-timer-thread-1 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Disconnecting(ServerSessionImpl[karaf@/10.30.171.102:33674]): SSH2_DISCONNECT_PROTOCOL_ERROR - Detected IdleTimeout after 1800007/1800000 ms. 2025-10-14T02:07:24,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] 
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:25,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:07:26,564 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING 2025-10-14T02:07:26,564 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:07:26,988 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:28,008 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:29,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:07:30,048 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:31,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:32,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:33,106 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:07:34,126 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:35,146 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:36,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:37,186 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:07:38,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:39,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:40,246 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:41,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:41,623 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T02:07:41,623 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:07:42,287 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:43,307 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:44,327 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:45,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:46,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:47,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:48,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:49,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:50,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:51,468 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:52,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:53,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:07:54,527 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:55,546 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:56,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:07:56,673 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-10-14T02:07:56,674 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:07:57,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:58,606 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:07:59,626 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:08:00,646 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:01,668 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:02,688 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:03,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:04,727 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:05,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:06,769 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:07,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:08,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:09,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:10,848 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:08:11,714 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING 2025-10-14T02:08:11,714 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:08:11,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:12,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:13,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:08:14,915 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-23-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.026650254 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-23-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.026650254 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] 
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-23-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.026650254 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:14,926 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:15,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:16,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:17,988 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:19,007 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:19,724 | ERROR | ForkJoinPool-11-worker-2 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-24-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.026475882 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-24-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.026475882 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-24-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.026475882 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:08:20,028 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:08:21,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:08:22,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:08:23,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:08:24,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:08:25,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:08:26,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:08:26,763 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T02:08:26,763 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:08:27,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:08:28,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:29,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:30,227 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:31,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:08:34,004 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.161:2550: 2177 millis 2025-10-14T02:08:34,005 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.170.116:2550: 2179 millis 2025-10-14T02:08:34,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:35,294 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:36,317 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:37,336 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:38,357 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:39,377 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:40,396 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:41,416 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:41,813 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T02:08:41,813 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:08:42,437 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:43,457 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:08:44,476 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:45,497 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... 5 more
2025-10-14T02:08:46,525 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:08:47,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:08:48,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:08:49,587 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:08:50,608 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:08:51,626 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:08:52,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:53,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:54,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:55,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:08:56,728 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:56,873 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-10-14T02:08:56,873 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:08:57,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:08:58,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:08:59,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:09:00,807 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:09:01,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:09:02,845 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:09:03,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:09:04,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:09:05,907 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:09:06,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:09:07,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:08,967 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:09,987 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:11,007 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:09:11,933 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-10-14T02:09:11,933 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:09:12,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:13,046 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:14,066 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:09:15,086 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:16,106 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:17,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:18,146 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:09:19,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:20,187 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:21,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:22,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:23,246 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the 02:09:22,226 occurrence)
2025-10-14T02:09:24,266 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the 02:09:22,226 occurrence)
2025-10-14T02:09:25,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the 02:09:22,226 occurrence)
2025-10-14T02:09:26,308 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the 02:09:22,226 occurrence)
2025-10-14T02:09:26,983 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T02:09:26,983 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:09:27,326 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the 02:09:22,226 occurrence)
2025-10-14T02:09:28,346 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the 02:09:22,226 occurrence)
2025-10-14T02:09:29,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	... (stack trace identical to the 02:09:22,226 occurrence)
5 more 2025-10-14T02:09:30,386 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:31,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:32,428 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:33,446 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:09:34,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:35,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:36,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:37,526 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:09:38,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:09:39,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:09:40,588 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:09:41,606 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:09:42,044 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-10-14T02:09:42,044 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:09:42,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:09:43,646 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:09:44,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:09:45,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:46,706 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:47,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:48,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:09:49,766 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:50,786 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:09:51,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:52,826 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:53,846 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:54,866 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:55,887 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:56,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:57,093 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-10-14T02:09:57,094 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:09:57,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:58,947 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:09:59,966 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:00,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:02,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:03,026 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:04,047 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:10:05,068 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:06,086 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:07,106 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:08,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:09,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:10,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:11,188 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:12,143 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-14T02:10:12,144 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:10:12,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:13,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:14,247 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:14,935 | ERROR | ForkJoinPool-11-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-25-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.016798305 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-25-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.016798305 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-25-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.016798305 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:15,267 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:16,286 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:17,306 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:18,328 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:19,346 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:19,744 | ERROR | ForkJoinPool-11-worker-2 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-26-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.017012866 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-26-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.017012866 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-26-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#462904850], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.017012866 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:20,367 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:10:21,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:22,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:23,427 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:24,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-14T02:10:25,468 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:26,486 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:10:27,203 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-10-14T02:10:27,203 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:10:27,507 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:10:28,526 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:10:29,547 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:10:30,567 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:10:31,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:10:32,607 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:10:33,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
2025-10-14T02:10:34,647 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:35,667 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:10:36,687 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:37,707 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:38,726 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:39,746 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:10:40,768 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:41,787 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:42,243 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15039 ms in state COMMIT_PENDING 2025-10-14T02:10:42,243 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:10:42,806 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:10:43,827 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:44,847 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:45,867 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:46,886 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:10:47,906 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:48,927 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:49,946 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:50,968 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:51,986 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:53,006 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:54,027 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:55,046 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:10:56,067 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:57,087 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:10:57,293 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-10-14T02:10:57,293 | WARN | opendaylight-cluster-data-shard-dispatcher-37 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-10-14T02:10:58,107 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:10:59,127 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:11:00,006 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.PreLeader Verification" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.PreLeader Verification 2025-10-14T02:11:00,147 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:11:01,167 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:11:02,186 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:11:03,207 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:11:04,226 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-14T02:11:05,328 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 02:11:04,226 occurrence above]
2025-10-14T02:11:06,347 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 02:11:04,226 occurrence above]
2025-10-14T02:11:07,366 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 02:11:04,226 occurrence above]
2025-10-14T02:11:08,387 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 02:11:04,226 occurrence above]
2025-10-14T02:11:09,406 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 02:11:04,226 occurrence above]
2025-10-14T02:11:10,426 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 02:11:04,226 occurrence above]
2025-10-14T02:11:11,447 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 02:11:04,226 occurrence above]
2025-10-14T02:11:12,354 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Current transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15061 ms in state COMMIT_PENDING
2025-10-14T02:11:12,354 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-1-shard-inventory-config: Transaction member-2-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-10-14T02:11:12,467 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:11:13,487 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:11:14,549 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:11:15,566 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:11:16,586 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] 
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-14T02:11:17,606 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-14T02:11:18,627 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:11:19,646 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-14T02:11:20,666 | WARN | ForkJoinPool.commonPool-worker-3 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1011266636] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more